Mar 20 10:54:33 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 20 10:54:33 crc restorecon[4757]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 10:54:33 crc restorecon[4757]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 20 10:54:33 crc restorecon[4757]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:33 crc 
restorecon[4757]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 10:54:33 crc restorecon[4757]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 10:54:33 crc restorecon[4757]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 10:54:33 crc restorecon[4757]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 10:54:33 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 10:54:34 crc 
restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 
10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 10:54:34 crc 
restorecon[4757]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 10:54:34 crc 
restorecon[4757]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 
crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 
10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 10:54:34 crc 
restorecon[4757]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc 
restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 10:54:34 crc restorecon[4757]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 
20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:34 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]:
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 
crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc 
restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]:
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc 
restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc 
restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc 
restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 10:54:35 crc restorecon[4757]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 10:54:35 crc 
restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 10:54:35 crc restorecon[4757]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 10:54:35 crc restorecon[4757]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 10:54:35 crc restorecon[4757]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 20 10:54:36 crc kubenswrapper[4860]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 20 10:54:36 crc kubenswrapper[4860]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 20 10:54:36 crc kubenswrapper[4860]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 20 10:54:36 crc kubenswrapper[4860]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 20 10:54:36 crc kubenswrapper[4860]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 20 10:54:36 crc kubenswrapper[4860]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.861021 4860 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.901839 4860 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.901891 4860 feature_gate.go:330] unrecognized feature gate: Example Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.901898 4860 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.901907 4860 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.901914 4860 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.901920 4860 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.901927 4860 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.901932 4860 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.901939 4860 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.901944 4860 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.901950 4860 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.901957 4860 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.901963 4860 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.901969 4860 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.901975 4860 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.901980 4860 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.901985 4860 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.901991 4860 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.901996 4860 
feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902001 4860 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902007 4860 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902012 4860 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902017 4860 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902022 4860 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902028 4860 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902033 4860 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902038 4860 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902045 4860 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902053 4860 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902058 4860 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902064 4860 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902069 4860 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902074 4860 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902086 4860 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902091 4860 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902100 4860 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902106 4860 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902112 4860 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902118 4860 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902124 4860 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902129 4860 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902134 4860 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902139 4860 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902144 4860 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902150 4860 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902156 4860 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902161 4860 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902166 4860 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902170 4860 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902176 4860 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902181 4860 feature_gate.go:330] 
unrecognized feature gate: MultiArchInstallAWS Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902186 4860 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902191 4860 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902196 4860 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902201 4860 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902207 4860 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902212 4860 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902217 4860 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902246 4860 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902251 4860 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902256 4860 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902262 4860 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902267 4860 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902273 4860 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902291 4860 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 20 10:54:36 
crc kubenswrapper[4860]: W0320 10:54:36.902298 4860 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902305 4860 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902311 4860 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902319 4860 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902326 4860 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.902335 4860 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.903387 4860 flags.go:64] FLAG: --address="0.0.0.0" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.903410 4860 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.903427 4860 flags.go:64] FLAG: --anonymous-auth="true" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.903435 4860 flags.go:64] FLAG: --application-metrics-count-limit="100" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.903443 4860 flags.go:64] FLAG: --authentication-token-webhook="false" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.903452 4860 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.903461 4860 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.903470 4860 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.903478 4860 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Mar 20 10:54:36 crc 
kubenswrapper[4860]: I0320 10:54:36.903485 4860 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.903492 4860 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.903500 4860 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.903507 4860 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.903515 4860 flags.go:64] FLAG: --cgroup-root="" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.903521 4860 flags.go:64] FLAG: --cgroups-per-qos="true" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.903529 4860 flags.go:64] FLAG: --client-ca-file="" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.903537 4860 flags.go:64] FLAG: --cloud-config="" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.903545 4860 flags.go:64] FLAG: --cloud-provider="" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.903554 4860 flags.go:64] FLAG: --cluster-dns="[]" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.903568 4860 flags.go:64] FLAG: --cluster-domain="" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.903577 4860 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.903587 4860 flags.go:64] FLAG: --config-dir="" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904563 4860 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904577 4860 flags.go:64] FLAG: --container-log-max-files="5" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904586 4860 flags.go:64] FLAG: --container-log-max-size="10Mi" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904593 4860 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Mar 20 
10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904600 4860 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904607 4860 flags.go:64] FLAG: --containerd-namespace="k8s.io" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904613 4860 flags.go:64] FLAG: --contention-profiling="false" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904619 4860 flags.go:64] FLAG: --cpu-cfs-quota="true" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904626 4860 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904632 4860 flags.go:64] FLAG: --cpu-manager-policy="none" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904638 4860 flags.go:64] FLAG: --cpu-manager-policy-options="" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904646 4860 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904653 4860 flags.go:64] FLAG: --enable-controller-attach-detach="true" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904659 4860 flags.go:64] FLAG: --enable-debugging-handlers="true" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904665 4860 flags.go:64] FLAG: --enable-load-reader="false" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904671 4860 flags.go:64] FLAG: --enable-server="true" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904678 4860 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904686 4860 flags.go:64] FLAG: --event-burst="100" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904692 4860 flags.go:64] FLAG: --event-qps="50" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904698 4860 flags.go:64] FLAG: --event-storage-age-limit="default=0" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904705 4860 flags.go:64] FLAG: 
--event-storage-event-limit="default=0" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904711 4860 flags.go:64] FLAG: --eviction-hard="" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904718 4860 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904724 4860 flags.go:64] FLAG: --eviction-minimum-reclaim="" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904731 4860 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904742 4860 flags.go:64] FLAG: --eviction-soft="" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904749 4860 flags.go:64] FLAG: --eviction-soft-grace-period="" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904756 4860 flags.go:64] FLAG: --exit-on-lock-contention="false" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904763 4860 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904770 4860 flags.go:64] FLAG: --experimental-mounter-path="" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904776 4860 flags.go:64] FLAG: --fail-cgroupv1="false" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904782 4860 flags.go:64] FLAG: --fail-swap-on="true" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904788 4860 flags.go:64] FLAG: --feature-gates="" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904797 4860 flags.go:64] FLAG: --file-check-frequency="20s" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904805 4860 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904817 4860 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904832 4860 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904841 4860 
flags.go:64] FLAG: --healthz-port="10248" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904849 4860 flags.go:64] FLAG: --help="false" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904857 4860 flags.go:64] FLAG: --hostname-override="" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904865 4860 flags.go:64] FLAG: --housekeeping-interval="10s" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904873 4860 flags.go:64] FLAG: --http-check-frequency="20s" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904879 4860 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904885 4860 flags.go:64] FLAG: --image-credential-provider-config="" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904892 4860 flags.go:64] FLAG: --image-gc-high-threshold="85" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904898 4860 flags.go:64] FLAG: --image-gc-low-threshold="80" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904905 4860 flags.go:64] FLAG: --image-service-endpoint="" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904910 4860 flags.go:64] FLAG: --kernel-memcg-notification="false" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904917 4860 flags.go:64] FLAG: --kube-api-burst="100" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904923 4860 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904930 4860 flags.go:64] FLAG: --kube-api-qps="50" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904936 4860 flags.go:64] FLAG: --kube-reserved="" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904944 4860 flags.go:64] FLAG: --kube-reserved-cgroup="" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904950 4860 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904957 4860 
flags.go:64] FLAG: --kubelet-cgroups="" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904962 4860 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904969 4860 flags.go:64] FLAG: --lock-file="" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904975 4860 flags.go:64] FLAG: --log-cadvisor-usage="false" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904982 4860 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.904988 4860 flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905003 4860 flags.go:64] FLAG: --log-json-split-stream="false" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905011 4860 flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905017 4860 flags.go:64] FLAG: --log-text-split-stream="false" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905023 4860 flags.go:64] FLAG: --logging-format="text" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905029 4860 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905036 4860 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905042 4860 flags.go:64] FLAG: --manifest-url="" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905048 4860 flags.go:64] FLAG: --manifest-url-header="" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905056 4860 flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905062 4860 flags.go:64] FLAG: --max-open-files="1000000" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905070 4860 flags.go:64] FLAG: --max-pods="110" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905076 4860 flags.go:64] 
FLAG: --maximum-dead-containers="-1" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905082 4860 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905088 4860 flags.go:64] FLAG: --memory-manager-policy="None" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905095 4860 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905101 4860 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905107 4860 flags.go:64] FLAG: --node-ip="192.168.126.11" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905113 4860 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905128 4860 flags.go:64] FLAG: --node-status-max-images="50" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905134 4860 flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905141 4860 flags.go:64] FLAG: --oom-score-adj="-999" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905150 4860 flags.go:64] FLAG: --pod-cidr="" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905156 4860 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905166 4860 flags.go:64] FLAG: --pod-manifest-path="" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905172 4860 flags.go:64] FLAG: --pod-max-pids="-1" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905178 4860 flags.go:64] FLAG: --pods-per-core="0" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905184 4860 flags.go:64] FLAG: --port="10250" Mar 20 10:54:36 crc 
kubenswrapper[4860]: I0320 10:54:36.905193 4860 flags.go:64] FLAG: --protect-kernel-defaults="false" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905202 4860 flags.go:64] FLAG: --provider-id="" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905211 4860 flags.go:64] FLAG: --qos-reserved="" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905222 4860 flags.go:64] FLAG: --read-only-port="10255" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905270 4860 flags.go:64] FLAG: --register-node="true" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905279 4860 flags.go:64] FLAG: --register-schedulable="true" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905288 4860 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905347 4860 flags.go:64] FLAG: --registry-burst="10" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905356 4860 flags.go:64] FLAG: --registry-qps="5" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905365 4860 flags.go:64] FLAG: --reserved-cpus="" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905376 4860 flags.go:64] FLAG: --reserved-memory="" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905388 4860 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905397 4860 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905406 4860 flags.go:64] FLAG: --rotate-certificates="false" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905415 4860 flags.go:64] FLAG: --rotate-server-certificates="false" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905425 4860 flags.go:64] FLAG: --runonce="false" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905434 4860 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 
10:54:36.905444 4860 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905454 4860 flags.go:64] FLAG: --seccomp-default="false" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905463 4860 flags.go:64] FLAG: --serialize-image-pulls="true" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905472 4860 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905482 4860 flags.go:64] FLAG: --storage-driver-db="cadvisor" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905491 4860 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905500 4860 flags.go:64] FLAG: --storage-driver-password="root" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905509 4860 flags.go:64] FLAG: --storage-driver-secure="false" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905518 4860 flags.go:64] FLAG: --storage-driver-table="stats" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905527 4860 flags.go:64] FLAG: --storage-driver-user="root" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905536 4860 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905546 4860 flags.go:64] FLAG: --sync-frequency="1m0s" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905555 4860 flags.go:64] FLAG: --system-cgroups="" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905564 4860 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905578 4860 flags.go:64] FLAG: --system-reserved-cgroup="" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905587 4860 flags.go:64] FLAG: --tls-cert-file="" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905596 4860 flags.go:64] FLAG: --tls-cipher-suites="[]" Mar 20 10:54:36 
crc kubenswrapper[4860]: I0320 10:54:36.905609 4860 flags.go:64] FLAG: --tls-min-version="" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905619 4860 flags.go:64] FLAG: --tls-private-key-file="" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905628 4860 flags.go:64] FLAG: --topology-manager-policy="none" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905637 4860 flags.go:64] FLAG: --topology-manager-policy-options="" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905646 4860 flags.go:64] FLAG: --topology-manager-scope="container" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905655 4860 flags.go:64] FLAG: --v="2" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905666 4860 flags.go:64] FLAG: --version="false" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905678 4860 flags.go:64] FLAG: --vmodule="" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905688 4860 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.905698 4860 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.905987 4860 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906006 4860 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906031 4860 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906049 4860 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906061 4860 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906074 4860 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906085 4860 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906096 4860 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906110 4860 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906122 4860 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906132 4860 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906140 4860 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906149 4860 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906158 4860 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906167 4860 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906175 4860 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906183 4860 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906190 4860 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 20 10:54:36 crc 
kubenswrapper[4860]: W0320 10:54:36.906199 4860 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906207 4860 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906214 4860 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906222 4860 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906265 4860 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906274 4860 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906282 4860 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906290 4860 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906297 4860 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906308 4860 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906318 4860 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906327 4860 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906335 4860 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906344 4860 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906352 4860 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906363 4860 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906372 4860 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906381 4860 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906389 4860 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906397 4860 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906407 4860 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906415 4860 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906423 4860 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906431 4860 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 20 10:54:36 crc 
kubenswrapper[4860]: W0320 10:54:36.906438 4860 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906446 4860 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906453 4860 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906461 4860 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906469 4860 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906476 4860 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906484 4860 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906492 4860 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906500 4860 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906508 4860 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906515 4860 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906524 4860 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906533 4860 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906544 4860 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906554 4860 feature_gate.go:330] 
unrecognized feature gate: SignatureStores Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906565 4860 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906574 4860 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906582 4860 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906592 4860 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906602 4860 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906612 4860 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906622 4860 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906632 4860 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906643 4860 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906652 4860 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906659 4860 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906667 4860 feature_gate.go:330] unrecognized feature gate: Example Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906674 4860 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.906683 4860 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.906695 4860 
feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.932192 4860 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.932272 4860 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932365 4860 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932375 4860 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932380 4860 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932386 4860 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932391 4860 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932397 4860 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932401 4860 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932407 4860 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932412 4860 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 20 10:54:36 crc 
kubenswrapper[4860]: W0320 10:54:36.932417 4860 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932422 4860 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932427 4860 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932432 4860 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932437 4860 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932442 4860 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932448 4860 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932453 4860 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932458 4860 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932463 4860 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932468 4860 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932473 4860 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932478 4860 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932483 4860 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932488 4860 feature_gate.go:330] 
unrecognized feature gate: ClusterAPIInstall Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932494 4860 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932503 4860 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932509 4860 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932514 4860 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932521 4860 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932529 4860 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932535 4860 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932540 4860 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932546 4860 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932551 4860 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932556 4860 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932562 4860 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932567 4860 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 
10:54:36.932572 4860 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932576 4860 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932582 4860 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932587 4860 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932592 4860 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932598 4860 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932603 4860 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932608 4860 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932613 4860 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932619 4860 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932624 4860 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932629 4860 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932633 4860 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932639 4860 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932645 4860 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. 
It will be removed in a future release. Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932651 4860 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932656 4860 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932661 4860 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932666 4860 feature_gate.go:330] unrecognized feature gate: Example Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932671 4860 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932675 4860 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932683 4860 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932688 4860 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932693 4860 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932698 4860 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932703 4860 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932708 4860 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932713 4860 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932718 4860 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 20 10:54:36 crc 
kubenswrapper[4860]: W0320 10:54:36.932723 4860 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932729 4860 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932735 4860 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932740 4860 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932753 4860 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.932762 4860 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932954 4860 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932969 4860 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932975 4860 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932982 4860 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932989 4860 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.932995 4860 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933001 4860 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933007 4860 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933012 4860 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933018 4860 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933023 4860 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933028 4860 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933033 4860 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933039 4860 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933044 4860 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933051 4860 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933057 4860 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933063 4860 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933068 4860 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933074 4860 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933079 4860 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933084 4860 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933089 4860 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933094 4860 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933099 4860 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933104 4860 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933109 4860 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933114 4860 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933119 4860 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933124 4860 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933130 4860 
feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933136 4860 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933142 4860 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933148 4860 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933166 4860 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933175 4860 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933182 4860 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933188 4860 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933194 4860 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933200 4860 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933206 4860 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933261 4860 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933270 4860 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933275 4860 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933282 4860 feature_gate.go:330] 
unrecognized feature gate: GCPClusterHostedDNS Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933287 4860 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933294 4860 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933300 4860 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933306 4860 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933312 4860 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933317 4860 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933323 4860 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933329 4860 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933335 4860 feature_gate.go:330] unrecognized feature gate: Example Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933342 4860 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933347 4860 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933353 4860 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933359 4860 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933364 4860 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 
10:54:36.933369 4860 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933375 4860 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933381 4860 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933387 4860 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933395 4860 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933402 4860 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933409 4860 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933415 4860 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933420 4860 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933426 4860 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933431 4860 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 20 10:54:36 crc kubenswrapper[4860]: W0320 10:54:36.933455 4860 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.933466 4860 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false 
ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.933747 4860 server.go:940] "Client rotation is on, will bootstrap in background" Mar 20 10:54:36 crc kubenswrapper[4860]: E0320 10:54:36.954391 4860 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.960629 4860 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.960775 4860 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.962624 4860 server.go:997] "Starting client certificate rotation" Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.962746 4860 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Mar 20 10:54:36 crc kubenswrapper[4860]: I0320 10:54:36.962927 4860 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.054369 4860 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 20 10:54:37 crc kubenswrapper[4860]: E0320 10:54:37.057372 4860 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 
38.102.83.201:6443: connect: connection refused" logger="UnhandledError" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.057508 4860 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.098571 4860 log.go:25] "Validated CRI v1 runtime API" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.199614 4860 log.go:25] "Validated CRI v1 image API" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.202081 4860 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.207196 4860 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-20-10-49-35-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.207287 4860 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:41 fsType:tmpfs blockSize:0}] Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.237194 4860 manager.go:217] Machine: {Timestamp:2026-03-20 10:54:37.234640394 +0000 UTC m=+1.456001372 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654132736 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec 
SystemUUID:5064a76f-5382-46f7-bae1-fe91bc80db78 BootID:d21bb8ef-2c26-4952-9b24-e8f54bfb6e63 Filesystems:[{Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827068416 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:41 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730829824 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:da:04:9a Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:da:04:9a Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:b1:1c:a5 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:6b:e9:f9 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:9c:60:c6 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:aa:83:01 Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:bb:20:d7 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:a2:33:e0:9b:18:d8 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:2a:1a:6b:5b:9d:a7 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654132736 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] 
Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified 
Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.238037 4860 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.238404 4860 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.239046 4860 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.239561 4860 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.239781 4860 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.241124 4860 topology_manager.go:138] "Creating topology manager with none policy"
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.241356 4860 container_manager_linux.go:303] "Creating device plugin manager"
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.242301 4860 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.242456 4860 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.243300 4860 state_mem.go:36] "Initialized new in-memory state store"
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.243642 4860 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.246932 4860 kubelet.go:418] "Attempting to sync node with API server"
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.247061 4860 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.247164 4860 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.247280 4860 kubelet.go:324] "Adding apiserver pod source"
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.247400 4860 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.251732 4860 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.253114 4860 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.255741 4860 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Mar 20 10:54:37 crc kubenswrapper[4860]: W0320 10:54:37.256885 4860 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.201:6443: connect: connection refused
Mar 20 10:54:37 crc kubenswrapper[4860]: E0320 10:54:37.257019 4860 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.201:6443: connect: connection refused" logger="UnhandledError"
Mar 20 10:54:37 crc kubenswrapper[4860]: W0320 10:54:37.256889 4860 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.201:6443: connect: connection refused
Mar 20 10:54:37 crc kubenswrapper[4860]: E0320 10:54:37.257198 4860 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.201:6443: connect: connection refused" logger="UnhandledError"
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.257125 4860 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.257380 4860 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.257431 4860 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.257478 4860 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.257528 4860 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.257576 4860 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.257622 4860 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.257682 4860 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.257736 4860 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.257781 4860 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.257847 4860 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.257898 4860 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.258721 4860 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.259031 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.201:6443: connect: connection refused
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.259408 4860 server.go:1280] "Started kubelet"
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.261422 4860 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.259492 4860 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Mar 20 10:54:37 crc systemd[1]: Started Kubernetes Kubelet.
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.264427 4860 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.265317 4860 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.265393 4860 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.265656 4860 volume_manager.go:287] "The desired_state_of_world populator starts"
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.265682 4860 volume_manager.go:289] "Starting Kubelet Volume Manager"
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.265869 4860 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Mar 20 10:54:37 crc kubenswrapper[4860]: E0320 10:54:37.265709 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 10:54:37 crc kubenswrapper[4860]: W0320 10:54:37.266444 4860 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.201:6443: connect: connection refused
Mar 20 10:54:37 crc kubenswrapper[4860]: E0320 10:54:37.266512 4860 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.201:6443: connect: connection refused" logger="UnhandledError"
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.268117 4860 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.268179 4860 factory.go:55] Registering systemd factory
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.268193 4860 factory.go:221] Registration of the systemd container factory successfully
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.302260 4860 server.go:460] "Adding debug handlers to kubelet server"
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.303282 4860 factory.go:153] Registering CRI-O factory
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.303483 4860 factory.go:221] Registration of the crio container factory successfully
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.303504 4860 factory.go:103] Registering Raw factory
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.303520 4860 manager.go:1196] Started watching for new ooms in manager
Mar 20 10:54:37 crc kubenswrapper[4860]: E0320 10:54:37.302405 4860 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.201:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189e874f2723de82 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:37.259382402 +0000 UTC m=+1.480743300,LastTimestamp:2026-03-20 10:54:37.259382402 +0000 UTC m=+1.480743300,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.304634 4860 manager.go:319] Starting recovery of all containers
Mar 20 10:54:37 crc kubenswrapper[4860]: E0320 10:54:37.304620 4860 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused" interval="200ms"
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.312451 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.312584 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.312652 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.312688 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.312725 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.312757 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.312775 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.312797 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.312819 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.312865 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.312883 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.312936 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.312955 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.313050 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.313071 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.313089 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.313139 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.313159 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.313870 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.314546 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.314587 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.314615 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.314641 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.314672 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.314694 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.314719 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.314748 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.314773 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.314796 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.314906 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.314936 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.314968 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315001 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315030 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315058 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315087 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315118 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315147 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315175 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315208 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315275 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315304 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315334 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315358 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315381 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315403 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315424 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315450 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315470 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315492 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315513 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315536 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315570 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315597 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315622 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315651 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315675 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315700 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315722 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315742 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315764 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315784 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315806 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315826 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315845 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315866 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315889 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315912 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315931 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315951 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315972 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.315995 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.316012 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.316033 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.316053 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.316072 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.316091 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.316158 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.316179 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.316202 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.316261 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.316292 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.316315 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.316339 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b"
volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.316360 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.316383 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.316419 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.316486 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.316513 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.316535 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" 
seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.316556 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.316577 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.316601 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.316623 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.316748 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.316772 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.316795 4860 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.316815 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.316840 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.316862 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.316884 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.316903 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.316927 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.316951 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.316982 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.317004 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.317026 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.317050 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.317072 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" 
volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.317096 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.317164 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.317192 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.317215 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.317303 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.317335 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" 
volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.317364 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.317393 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.317419 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.317441 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.317463 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.317486 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" 
volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.317506 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.317524 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.317544 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.317565 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.317586 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.317606 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" 
volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.317626 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.317646 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.317664 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.317686 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.317709 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.317730 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" 
seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.317748 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.317768 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.317791 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.317817 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.317847 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.317875 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.317902 4860 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.317938 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.325948 4860 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326014 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326051 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326082 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Mar 20 10:54:37 crc 
kubenswrapper[4860]: I0320 10:54:37.326107 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326134 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326156 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326175 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326200 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326218 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326265 4860 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326283 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326299 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326319 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326335 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326350 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326370 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326387 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326408 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326424 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326440 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326459 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326485 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326515 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326537 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326562 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326592 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326616 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326645 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" 
volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326665 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326684 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326709 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326734 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326760 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326782 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" 
volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326800 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326827 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326850 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326874 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326893 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326911 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" 
volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326936 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326958 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.326977 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.327000 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.327020 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.327044 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" 
seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.327065 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.327086 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.327111 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.327133 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.327164 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.327185 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 
10:54:37.327205 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.327268 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.327289 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.327317 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.327338 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.327361 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.327388 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.327407 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.327437 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.327463 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.327489 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.327518 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.327539 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" 
volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.327566 4860 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.327584 4860 reconstruct.go:97] "Volume reconstruction finished" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.327600 4860 reconciler.go:26] "Reconciler: start to sync state" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.335495 4860 manager.go:324] Recovery completed Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.346134 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.348562 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.348625 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.348643 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.349870 4860 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.349895 4860 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.349926 4860 state_mem.go:36] "Initialized new in-memory state store" Mar 20 10:54:37 crc kubenswrapper[4860]: E0320 10:54:37.366323 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" 
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.409888 4860 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.411995 4860 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.412062 4860 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.412093 4860 kubelet.go:2335] "Starting kubelet main sync loop" Mar 20 10:54:37 crc kubenswrapper[4860]: E0320 10:54:37.412150 4860 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 20 10:54:37 crc kubenswrapper[4860]: W0320 10:54:37.413015 4860 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.201:6443: connect: connection refused Mar 20 10:54:37 crc kubenswrapper[4860]: E0320 10:54:37.413094 4860 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.201:6443: connect: connection refused" logger="UnhandledError" Mar 20 10:54:37 crc kubenswrapper[4860]: E0320 10:54:37.467202 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.505880 4860 policy_none.go:49] "None policy: Start" Mar 20 10:54:37 crc kubenswrapper[4860]: E0320 10:54:37.506861 4860 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused" interval="400ms" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.507784 4860 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.507862 4860 state_mem.go:35] "Initializing new in-memory state store" Mar 20 10:54:37 crc kubenswrapper[4860]: E0320 10:54:37.512410 4860 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.557041 4860 manager.go:334] "Starting Device Plugin manager" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.557115 4860 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.557128 4860 server.go:79] "Starting device plugin registration server" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.557734 4860 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.557757 4860 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.557964 4860 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.558163 4860 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.558243 4860 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 20 10:54:37 crc kubenswrapper[4860]: E0320 10:54:37.568893 4860 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" 
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.660585 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.661749 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.661795 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.661808 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.661846 4860 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 10:54:37 crc kubenswrapper[4860]: E0320 10:54:37.662488 4860 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.201:6443: connect: connection refused" node="crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.712826 4860 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.712953 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.714550 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.714621 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:37 crc 
kubenswrapper[4860]: I0320 10:54:37.714635 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.714855 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.715320 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.715419 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.716170 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.716206 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.716217 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.716404 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.716546 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.716590 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.717159 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.717189 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.717202 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.717621 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.717653 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.717664 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.717711 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.717743 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.717756 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.717758 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.717848 
4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.717873 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.718676 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.718705 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.718713 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.718928 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.719004 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.719016 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.719213 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.719411 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.719482 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.720370 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.720420 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.720436 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.721076 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.721186 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.721494 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.721562 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.721582 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.724811 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.724866 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" 
Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.724886 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.835464 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.835533 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.835569 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.835656 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.835711 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " 
pod="openshift-etcd/etcd-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.835733 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.835752 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.835770 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.835957 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.836119 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.836196 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.836309 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.836408 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.836512 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.836593 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 
10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.862767 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.864909 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.864983 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.865008 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.865060 4860 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 10:54:37 crc kubenswrapper[4860]: E0320 10:54:37.865729 4860 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.201:6443: connect: connection refused" node="crc" Mar 20 10:54:37 crc kubenswrapper[4860]: E0320 10:54:37.908957 4860 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused" interval="800ms" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.938314 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.938428 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.938466 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.938496 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.938525 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.938624 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.938660 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " 
pod="openshift-etcd/etcd-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.938691 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.938726 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.938755 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.938798 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.938834 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.938875 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.938874 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.938921 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.938963 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.938968 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.939018 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 
10:54:37.938906 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.938894 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.939058 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.938873 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.939139 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.939138 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") 
pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.939146 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.939171 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.939159 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.939211 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:54:37 crc kubenswrapper[4860]: I0320 10:54:37.939345 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 10:54:37 
crc kubenswrapper[4860]: I0320 10:54:37.939177 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:54:38 crc kubenswrapper[4860]: I0320 10:54:38.054950 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 10:54:38 crc kubenswrapper[4860]: I0320 10:54:38.079894 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 20 10:54:38 crc kubenswrapper[4860]: I0320 10:54:38.089435 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:54:38 crc kubenswrapper[4860]: I0320 10:54:38.105831 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:54:38 crc kubenswrapper[4860]: I0320 10:54:38.111274 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 10:54:38 crc kubenswrapper[4860]: W0320 10:54:38.212685 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-5b8f68d1880d2199dfe0832a4a4b11156fd64cae44a14371af0f6373df9382ff WatchSource:0}: Error finding container 5b8f68d1880d2199dfe0832a4a4b11156fd64cae44a14371af0f6373df9382ff: Status 404 returned error can't find the container with id 5b8f68d1880d2199dfe0832a4a4b11156fd64cae44a14371af0f6373df9382ff Mar 20 10:54:38 crc kubenswrapper[4860]: W0320 10:54:38.220744 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-4040a69d2e69ca1076e30b21da8368c21cffcac5c54a897687c045e66d118762 WatchSource:0}: Error finding container 4040a69d2e69ca1076e30b21da8368c21cffcac5c54a897687c045e66d118762: Status 404 returned error can't find the container with id 4040a69d2e69ca1076e30b21da8368c21cffcac5c54a897687c045e66d118762 Mar 20 10:54:38 crc kubenswrapper[4860]: W0320 10:54:38.225797 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-d26046e0fc04d62fede089c65376ca817da3aa3eb1742f8fcd311078e9176ae7 WatchSource:0}: Error finding container d26046e0fc04d62fede089c65376ca817da3aa3eb1742f8fcd311078e9176ae7: Status 404 returned error can't find the container with id d26046e0fc04d62fede089c65376ca817da3aa3eb1742f8fcd311078e9176ae7 Mar 20 10:54:38 crc kubenswrapper[4860]: W0320 10:54:38.230562 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-b4e0f0c84731c097d29fd9bf509b5837e268a47875b5079a5374a0db41ce730b 
WatchSource:0}: Error finding container b4e0f0c84731c097d29fd9bf509b5837e268a47875b5079a5374a0db41ce730b: Status 404 returned error can't find the container with id b4e0f0c84731c097d29fd9bf509b5837e268a47875b5079a5374a0db41ce730b Mar 20 10:54:38 crc kubenswrapper[4860]: W0320 10:54:38.231661 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-976163a14d3e179ca944b38f466bf4a70df3e317ec9422624d678378f7eb2c7d WatchSource:0}: Error finding container 976163a14d3e179ca944b38f466bf4a70df3e317ec9422624d678378f7eb2c7d: Status 404 returned error can't find the container with id 976163a14d3e179ca944b38f466bf4a70df3e317ec9422624d678378f7eb2c7d Mar 20 10:54:38 crc kubenswrapper[4860]: W0320 10:54:38.239264 4860 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.201:6443: connect: connection refused Mar 20 10:54:38 crc kubenswrapper[4860]: E0320 10:54:38.239402 4860 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.201:6443: connect: connection refused" logger="UnhandledError" Mar 20 10:54:38 crc kubenswrapper[4860]: I0320 10:54:38.260040 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.201:6443: connect: connection refused Mar 20 10:54:38 crc kubenswrapper[4860]: I0320 10:54:38.266148 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:38 crc 
kubenswrapper[4860]: I0320 10:54:38.267318 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:38 crc kubenswrapper[4860]: I0320 10:54:38.267384 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:38 crc kubenswrapper[4860]: I0320 10:54:38.267404 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:38 crc kubenswrapper[4860]: I0320 10:54:38.267465 4860 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 10:54:38 crc kubenswrapper[4860]: E0320 10:54:38.268000 4860 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.201:6443: connect: connection refused" node="crc" Mar 20 10:54:38 crc kubenswrapper[4860]: W0320 10:54:38.298722 4860 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.201:6443: connect: connection refused Mar 20 10:54:38 crc kubenswrapper[4860]: E0320 10:54:38.298823 4860 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.201:6443: connect: connection refused" logger="UnhandledError" Mar 20 10:54:38 crc kubenswrapper[4860]: I0320 10:54:38.417567 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b4e0f0c84731c097d29fd9bf509b5837e268a47875b5079a5374a0db41ce730b"} Mar 20 
10:54:38 crc kubenswrapper[4860]: I0320 10:54:38.419027 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"d26046e0fc04d62fede089c65376ca817da3aa3eb1742f8fcd311078e9176ae7"} Mar 20 10:54:38 crc kubenswrapper[4860]: I0320 10:54:38.420477 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"976163a14d3e179ca944b38f466bf4a70df3e317ec9422624d678378f7eb2c7d"} Mar 20 10:54:38 crc kubenswrapper[4860]: I0320 10:54:38.421912 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4040a69d2e69ca1076e30b21da8368c21cffcac5c54a897687c045e66d118762"} Mar 20 10:54:38 crc kubenswrapper[4860]: I0320 10:54:38.423524 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"5b8f68d1880d2199dfe0832a4a4b11156fd64cae44a14371af0f6373df9382ff"} Mar 20 10:54:38 crc kubenswrapper[4860]: E0320 10:54:38.710270 4860 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused" interval="1.6s" Mar 20 10:54:38 crc kubenswrapper[4860]: W0320 10:54:38.733202 4860 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.201:6443: connect: connection refused Mar 20 10:54:38 crc kubenswrapper[4860]: E0320 10:54:38.733409 4860 reflector.go:158] 
"Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.201:6443: connect: connection refused" logger="UnhandledError" Mar 20 10:54:38 crc kubenswrapper[4860]: W0320 10:54:38.772777 4860 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.201:6443: connect: connection refused Mar 20 10:54:38 crc kubenswrapper[4860]: E0320 10:54:38.772908 4860 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.201:6443: connect: connection refused" logger="UnhandledError" Mar 20 10:54:39 crc kubenswrapper[4860]: I0320 10:54:39.068742 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:39 crc kubenswrapper[4860]: I0320 10:54:39.070153 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:39 crc kubenswrapper[4860]: I0320 10:54:39.070216 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:39 crc kubenswrapper[4860]: I0320 10:54:39.070252 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:39 crc kubenswrapper[4860]: I0320 10:54:39.070285 4860 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 10:54:39 crc kubenswrapper[4860]: E0320 10:54:39.070793 4860 kubelet_node_status.go:99] "Unable to register node with API server" 
err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.201:6443: connect: connection refused" node="crc" Mar 20 10:54:39 crc kubenswrapper[4860]: I0320 10:54:39.088964 4860 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 10:54:39 crc kubenswrapper[4860]: E0320 10:54:39.090388 4860 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.201:6443: connect: connection refused" logger="UnhandledError" Mar 20 10:54:39 crc kubenswrapper[4860]: I0320 10:54:39.260970 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.201:6443: connect: connection refused Mar 20 10:54:40 crc kubenswrapper[4860]: I0320 10:54:40.260178 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.201:6443: connect: connection refused Mar 20 10:54:40 crc kubenswrapper[4860]: E0320 10:54:40.311684 4860 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused" interval="3.2s" Mar 20 10:54:40 crc kubenswrapper[4860]: I0320 10:54:40.432178 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6bf4a38879e8e3c687bd3c57a3c68a29ad9a9e609ea0cbd220493b6ee4e7d9a3"} Mar 20 10:54:40 crc kubenswrapper[4860]: I0320 10:54:40.434718 4860 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="e74e43d4563dfd8a16d0e44d9e42cbad850bf1fe0f96f4cf7d1699ea3b0beec9" exitCode=0 Mar 20 10:54:40 crc kubenswrapper[4860]: I0320 10:54:40.434860 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"e74e43d4563dfd8a16d0e44d9e42cbad850bf1fe0f96f4cf7d1699ea3b0beec9"} Mar 20 10:54:40 crc kubenswrapper[4860]: I0320 10:54:40.434939 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:40 crc kubenswrapper[4860]: I0320 10:54:40.436589 4860 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90" exitCode=0 Mar 20 10:54:40 crc kubenswrapper[4860]: I0320 10:54:40.436691 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90"} Mar 20 10:54:40 crc kubenswrapper[4860]: I0320 10:54:40.436732 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:40 crc kubenswrapper[4860]: I0320 10:54:40.436964 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:40 crc kubenswrapper[4860]: I0320 10:54:40.436995 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:40 crc kubenswrapper[4860]: I0320 
10:54:40.437004 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:40 crc kubenswrapper[4860]: I0320 10:54:40.437899 4860 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4" exitCode=0 Mar 20 10:54:40 crc kubenswrapper[4860]: I0320 10:54:40.437948 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4"} Mar 20 10:54:40 crc kubenswrapper[4860]: I0320 10:54:40.438043 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:40 crc kubenswrapper[4860]: I0320 10:54:40.438382 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:40 crc kubenswrapper[4860]: I0320 10:54:40.438452 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:40 crc kubenswrapper[4860]: I0320 10:54:40.438498 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:40 crc kubenswrapper[4860]: I0320 10:54:40.440634 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:40 crc kubenswrapper[4860]: I0320 10:54:40.440687 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:40 crc kubenswrapper[4860]: I0320 10:54:40.440713 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:40 crc kubenswrapper[4860]: I0320 10:54:40.440709 4860 generic.go:334] "Generic (PLEG): container 
finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="229f12dfeed51b7ac52ddcc0137be0fa53ddc12d3969ca2c10cdf6d6ac80932d" exitCode=0 Mar 20 10:54:40 crc kubenswrapper[4860]: I0320 10:54:40.440754 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"229f12dfeed51b7ac52ddcc0137be0fa53ddc12d3969ca2c10cdf6d6ac80932d"} Mar 20 10:54:40 crc kubenswrapper[4860]: I0320 10:54:40.440837 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:40 crc kubenswrapper[4860]: I0320 10:54:40.442142 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:40 crc kubenswrapper[4860]: I0320 10:54:40.442169 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:40 crc kubenswrapper[4860]: I0320 10:54:40.442180 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:40 crc kubenswrapper[4860]: I0320 10:54:40.442617 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:40 crc kubenswrapper[4860]: I0320 10:54:40.443899 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:40 crc kubenswrapper[4860]: I0320 10:54:40.443949 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:40 crc kubenswrapper[4860]: I0320 10:54:40.443961 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:40 crc kubenswrapper[4860]: W0320 10:54:40.457580 4860 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get 
"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.201:6443: connect: connection refused Mar 20 10:54:40 crc kubenswrapper[4860]: E0320 10:54:40.457692 4860 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.201:6443: connect: connection refused" logger="UnhandledError" Mar 20 10:54:40 crc kubenswrapper[4860]: I0320 10:54:40.671814 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:40 crc kubenswrapper[4860]: I0320 10:54:40.672957 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:40 crc kubenswrapper[4860]: I0320 10:54:40.672993 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:40 crc kubenswrapper[4860]: I0320 10:54:40.673005 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:40 crc kubenswrapper[4860]: I0320 10:54:40.673031 4860 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 10:54:40 crc kubenswrapper[4860]: E0320 10:54:40.673603 4860 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.201:6443: connect: connection refused" node="crc" Mar 20 10:54:40 crc kubenswrapper[4860]: W0320 10:54:40.787524 4860 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.201:6443: connect: connection refused Mar 20 10:54:40 crc 
kubenswrapper[4860]: E0320 10:54:40.787607 4860 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.201:6443: connect: connection refused" logger="UnhandledError" Mar 20 10:54:40 crc kubenswrapper[4860]: W0320 10:54:40.896244 4860 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.201:6443: connect: connection refused Mar 20 10:54:40 crc kubenswrapper[4860]: E0320 10:54:40.896331 4860 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.201:6443: connect: connection refused" logger="UnhandledError" Mar 20 10:54:41 crc kubenswrapper[4860]: I0320 10:54:41.260070 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.201:6443: connect: connection refused Mar 20 10:54:41 crc kubenswrapper[4860]: I0320 10:54:41.446651 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b411b2d78ae0ca6e465eafe2ca565d78630979ffc93ff9fb0785c70d42e4c447"} Mar 20 10:54:41 crc kubenswrapper[4860]: I0320 10:54:41.446716 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:41 crc kubenswrapper[4860]: I0320 10:54:41.446734 
4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4edc7f588e4fdfa1c92fcf94e685925ef7708d48f6dc4a72363331f66f0b4ab7"} Mar 20 10:54:41 crc kubenswrapper[4860]: I0320 10:54:41.446754 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"37468ccfc2574c3aefd9c88a616ce0f0c69cfa3a56b41c421d53461d4f8b2cc2"} Mar 20 10:54:41 crc kubenswrapper[4860]: I0320 10:54:41.450599 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:41 crc kubenswrapper[4860]: I0320 10:54:41.450672 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:41 crc kubenswrapper[4860]: I0320 10:54:41.450702 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:41 crc kubenswrapper[4860]: I0320 10:54:41.452952 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"acfeabcb583a79ac597fdc7d5ea5a7e28192a337743c9ac0d585d931e1a4406c"} Mar 20 10:54:41 crc kubenswrapper[4860]: I0320 10:54:41.453003 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"b968284eb846046e80483bafc625da31241ff1781ebd5216342bc2538064d124"} Mar 20 10:54:41 crc kubenswrapper[4860]: I0320 10:54:41.453014 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"8f9917a32ef81256fec12a0a0679be3cf3a2e1f5dab824c4c23c2cf252433a68"} Mar 20 10:54:41 crc kubenswrapper[4860]: I0320 10:54:41.453133 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:41 crc kubenswrapper[4860]: I0320 10:54:41.454049 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:41 crc kubenswrapper[4860]: I0320 10:54:41.454074 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:41 crc kubenswrapper[4860]: I0320 10:54:41.454084 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:41 crc kubenswrapper[4860]: I0320 10:54:41.457246 4860 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69" exitCode=0 Mar 20 10:54:41 crc kubenswrapper[4860]: I0320 10:54:41.457365 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69"} Mar 20 10:54:41 crc kubenswrapper[4860]: I0320 10:54:41.457406 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:41 crc kubenswrapper[4860]: I0320 10:54:41.458471 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:41 crc kubenswrapper[4860]: I0320 10:54:41.458513 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:41 crc kubenswrapper[4860]: I0320 10:54:41.458528 4860 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:41 crc kubenswrapper[4860]: I0320 10:54:41.464492 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c1e891984b7d45630352f0e37bf11ec8f9f8df49a17c27c868951abcc16e1a89"} Mar 20 10:54:41 crc kubenswrapper[4860]: I0320 10:54:41.464557 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392"} Mar 20 10:54:41 crc kubenswrapper[4860]: I0320 10:54:41.464579 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017"} Mar 20 10:54:41 crc kubenswrapper[4860]: I0320 10:54:41.464590 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e"} Mar 20 10:54:41 crc kubenswrapper[4860]: I0320 10:54:41.464601 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433"} Mar 20 10:54:41 crc kubenswrapper[4860]: I0320 10:54:41.464709 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:41 crc kubenswrapper[4860]: I0320 10:54:41.465837 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:41 crc 
kubenswrapper[4860]: I0320 10:54:41.465876 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:41 crc kubenswrapper[4860]: I0320 10:54:41.465891 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:41 crc kubenswrapper[4860]: I0320 10:54:41.468708 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:41 crc kubenswrapper[4860]: I0320 10:54:41.468576 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"9696d34484c79734fbd6e1a40f5e4ce6a680b0a67c52cb58ff7a1a1feb8390ed"} Mar 20 10:54:41 crc kubenswrapper[4860]: I0320 10:54:41.469632 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:41 crc kubenswrapper[4860]: I0320 10:54:41.469688 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:41 crc kubenswrapper[4860]: I0320 10:54:41.469702 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:41 crc kubenswrapper[4860]: W0320 10:54:41.500608 4860 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.201:6443: connect: connection refused Mar 20 10:54:41 crc kubenswrapper[4860]: E0320 10:54:41.500720 4860 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.201:6443: 
connect: connection refused" logger="UnhandledError" Mar 20 10:54:42 crc kubenswrapper[4860]: I0320 10:54:42.474644 4860 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691" exitCode=0 Mar 20 10:54:42 crc kubenswrapper[4860]: I0320 10:54:42.474759 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691"} Mar 20 10:54:42 crc kubenswrapper[4860]: I0320 10:54:42.474800 4860 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 10:54:42 crc kubenswrapper[4860]: I0320 10:54:42.474838 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:42 crc kubenswrapper[4860]: I0320 10:54:42.474861 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:42 crc kubenswrapper[4860]: I0320 10:54:42.474784 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:42 crc kubenswrapper[4860]: I0320 10:54:42.474937 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:42 crc kubenswrapper[4860]: I0320 10:54:42.475028 4860 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 10:54:42 crc kubenswrapper[4860]: I0320 10:54:42.475141 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:42 crc kubenswrapper[4860]: I0320 10:54:42.477088 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:42 crc kubenswrapper[4860]: I0320 10:54:42.477140 4860 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:42 crc kubenswrapper[4860]: I0320 10:54:42.477155 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:42 crc kubenswrapper[4860]: I0320 10:54:42.477533 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:42 crc kubenswrapper[4860]: I0320 10:54:42.477594 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:42 crc kubenswrapper[4860]: I0320 10:54:42.477621 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:42 crc kubenswrapper[4860]: I0320 10:54:42.477887 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:42 crc kubenswrapper[4860]: I0320 10:54:42.477915 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:42 crc kubenswrapper[4860]: I0320 10:54:42.477927 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:42 crc kubenswrapper[4860]: I0320 10:54:42.477940 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:42 crc kubenswrapper[4860]: I0320 10:54:42.477956 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:42 crc kubenswrapper[4860]: I0320 10:54:42.477971 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:42 crc kubenswrapper[4860]: I0320 10:54:42.477982 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:42 crc kubenswrapper[4860]: I0320 
10:54:42.477988 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:42 crc kubenswrapper[4860]: I0320 10:54:42.477996 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:42 crc kubenswrapper[4860]: I0320 10:54:42.638936 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:54:43 crc kubenswrapper[4860]: I0320 10:54:43.111187 4860 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 10:54:43 crc kubenswrapper[4860]: I0320 10:54:43.437752 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:54:43 crc kubenswrapper[4860]: I0320 10:54:43.445648 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:54:43 crc kubenswrapper[4860]: I0320 10:54:43.483375 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78"} Mar 20 10:54:43 crc kubenswrapper[4860]: I0320 10:54:43.483478 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63"} Mar 20 10:54:43 crc kubenswrapper[4860]: I0320 10:54:43.483505 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444"} Mar 20 10:54:43 crc kubenswrapper[4860]: I0320 
10:54:43.483523 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:43 crc kubenswrapper[4860]: I0320 10:54:43.483524 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934"} Mar 20 10:54:43 crc kubenswrapper[4860]: I0320 10:54:43.483658 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:54:43 crc kubenswrapper[4860]: I0320 10:54:43.483520 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:43 crc kubenswrapper[4860]: I0320 10:54:43.484640 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:43 crc kubenswrapper[4860]: I0320 10:54:43.484692 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:43 crc kubenswrapper[4860]: I0320 10:54:43.484709 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:43 crc kubenswrapper[4860]: I0320 10:54:43.485301 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:43 crc kubenswrapper[4860]: I0320 10:54:43.485334 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:43 crc kubenswrapper[4860]: I0320 10:54:43.485346 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:43 crc kubenswrapper[4860]: I0320 10:54:43.874051 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:43 crc 
kubenswrapper[4860]: I0320 10:54:43.876088 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:43 crc kubenswrapper[4860]: I0320 10:54:43.876201 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:43 crc kubenswrapper[4860]: I0320 10:54:43.876290 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:43 crc kubenswrapper[4860]: I0320 10:54:43.876350 4860 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 10:54:44 crc kubenswrapper[4860]: I0320 10:54:44.076190 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 10:54:44 crc kubenswrapper[4860]: I0320 10:54:44.076460 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:44 crc kubenswrapper[4860]: I0320 10:54:44.078059 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:44 crc kubenswrapper[4860]: I0320 10:54:44.078139 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:44 crc kubenswrapper[4860]: I0320 10:54:44.078158 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:44 crc kubenswrapper[4860]: I0320 10:54:44.368014 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:54:44 crc kubenswrapper[4860]: I0320 10:54:44.491316 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e"} Mar 20 
10:54:44 crc kubenswrapper[4860]: I0320 10:54:44.491381 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:44 crc kubenswrapper[4860]: I0320 10:54:44.491454 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:44 crc kubenswrapper[4860]: I0320 10:54:44.491565 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:44 crc kubenswrapper[4860]: I0320 10:54:44.492893 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:44 crc kubenswrapper[4860]: I0320 10:54:44.492930 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:44 crc kubenswrapper[4860]: I0320 10:54:44.492944 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:44 crc kubenswrapper[4860]: I0320 10:54:44.492956 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:44 crc kubenswrapper[4860]: I0320 10:54:44.492986 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:44 crc kubenswrapper[4860]: I0320 10:54:44.492995 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:44 crc kubenswrapper[4860]: I0320 10:54:44.493021 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:44 crc kubenswrapper[4860]: I0320 10:54:44.493055 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:44 crc kubenswrapper[4860]: I0320 10:54:44.493073 4860 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 20 10:54:45 crc kubenswrapper[4860]: I0320 10:54:45.494419 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:45 crc kubenswrapper[4860]: I0320 10:54:45.495870 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:45 crc kubenswrapper[4860]: I0320 10:54:45.495915 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:45 crc kubenswrapper[4860]: I0320 10:54:45.495934 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:45 crc kubenswrapper[4860]: I0320 10:54:45.648132 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 20 10:54:45 crc kubenswrapper[4860]: I0320 10:54:45.953215 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:54:45 crc kubenswrapper[4860]: I0320 10:54:45.953580 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:45 crc kubenswrapper[4860]: I0320 10:54:45.955750 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:45 crc kubenswrapper[4860]: I0320 10:54:45.955820 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:45 crc kubenswrapper[4860]: I0320 10:54:45.955834 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:46 crc kubenswrapper[4860]: I0320 10:54:46.497302 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:46 crc kubenswrapper[4860]: I0320 10:54:46.498708 4860 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:46 crc kubenswrapper[4860]: I0320 10:54:46.498771 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:46 crc kubenswrapper[4860]: I0320 10:54:46.498795 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:46 crc kubenswrapper[4860]: I0320 10:54:46.900831 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:54:46 crc kubenswrapper[4860]: I0320 10:54:46.901064 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:46 crc kubenswrapper[4860]: I0320 10:54:46.902656 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:46 crc kubenswrapper[4860]: I0320 10:54:46.902707 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:46 crc kubenswrapper[4860]: I0320 10:54:46.902719 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:47 crc kubenswrapper[4860]: I0320 10:54:47.367938 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:54:47 crc kubenswrapper[4860]: I0320 10:54:47.499536 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:47 crc kubenswrapper[4860]: I0320 10:54:47.500477 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:47 crc kubenswrapper[4860]: I0320 10:54:47.500510 4860 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:47 crc kubenswrapper[4860]: I0320 10:54:47.500521 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:47 crc kubenswrapper[4860]: E0320 10:54:47.569708 4860 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 10:54:49 crc kubenswrapper[4860]: I0320 10:54:49.403029 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 20 10:54:49 crc kubenswrapper[4860]: I0320 10:54:49.404918 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:49 crc kubenswrapper[4860]: I0320 10:54:49.408286 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:49 crc kubenswrapper[4860]: I0320 10:54:49.408354 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:49 crc kubenswrapper[4860]: I0320 10:54:49.408373 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:49 crc kubenswrapper[4860]: I0320 10:54:49.901736 4860 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 10:54:49 crc kubenswrapper[4860]: I0320 10:54:49.901894 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context 
deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 10:54:50 crc kubenswrapper[4860]: I0320 10:54:50.374269 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:54:50 crc kubenswrapper[4860]: I0320 10:54:50.374445 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:50 crc kubenswrapper[4860]: I0320 10:54:50.376011 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:50 crc kubenswrapper[4860]: I0320 10:54:50.376064 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:50 crc kubenswrapper[4860]: I0320 10:54:50.376077 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:52 crc kubenswrapper[4860]: I0320 10:54:52.033144 4860 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:52966->192.168.126.11:17697: read: connection reset by peer" start-of-body= Mar 20 10:54:52 crc kubenswrapper[4860]: I0320 10:54:52.033296 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:52966->192.168.126.11:17697: read: connection reset by peer" Mar 20 10:54:52 crc kubenswrapper[4860]: I0320 10:54:52.262018 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Mar 20 10:54:52 crc kubenswrapper[4860]: I0320 10:54:52.516207 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 20 10:54:52 crc kubenswrapper[4860]: I0320 10:54:52.525826 4860 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c1e891984b7d45630352f0e37bf11ec8f9f8df49a17c27c868951abcc16e1a89" exitCode=255 Mar 20 10:54:52 crc kubenswrapper[4860]: I0320 10:54:52.525887 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"c1e891984b7d45630352f0e37bf11ec8f9f8df49a17c27c868951abcc16e1a89"} Mar 20 10:54:52 crc kubenswrapper[4860]: I0320 10:54:52.526094 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:52 crc kubenswrapper[4860]: I0320 10:54:52.527245 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:52 crc kubenswrapper[4860]: I0320 10:54:52.527282 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:52 crc kubenswrapper[4860]: I0320 10:54:52.527294 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:52 crc kubenswrapper[4860]: I0320 10:54:52.527870 4860 scope.go:117] "RemoveContainer" containerID="c1e891984b7d45630352f0e37bf11ec8f9f8df49a17c27c868951abcc16e1a89" Mar 20 10:54:53 crc kubenswrapper[4860]: E0320 10:54:53.116195 4860 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate 
from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 20 10:54:53 crc kubenswrapper[4860]: W0320 10:54:53.214460 4860 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:54:53Z is after 2026-02-23T05:33:13Z Mar 20 10:54:53 crc kubenswrapper[4860]: E0320 10:54:53.214555 4860 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:54:53Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 10:54:53 crc kubenswrapper[4860]: W0320 10:54:53.215878 4860 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:54:53Z is after 2026-02-23T05:33:13Z Mar 20 10:54:53 crc kubenswrapper[4860]: E0320 10:54:53.215917 4860 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-20T10:54:53Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 10:54:53 crc kubenswrapper[4860]: E0320 10:54:53.216369 4860 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:54:53Z is after 2026-02-23T05:33:13Z" node="crc" Mar 20 10:54:53 crc kubenswrapper[4860]: E0320 10:54:53.219140 4860 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:54:53Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e874f2723de82 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:37.259382402 +0000 UTC m=+1.480743300,LastTimestamp:2026-03-20 10:54:37.259382402 +0000 UTC m=+1.480743300,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:54:53 crc kubenswrapper[4860]: E0320 10:54:53.221530 4860 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:54:53Z is after 2026-02-23T05:33:13Z" interval="6.4s" Mar 20 10:54:53 crc kubenswrapper[4860]: W0320 10:54:53.223577 4860 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get 
"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:54:53Z is after 2026-02-23T05:33:13Z Mar 20 10:54:53 crc kubenswrapper[4860]: E0320 10:54:53.223660 4860 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:54:53Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 10:54:53 crc kubenswrapper[4860]: I0320 10:54:53.223925 4860 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 20 10:54:53 crc kubenswrapper[4860]: I0320 10:54:53.224026 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 20 10:54:53 crc kubenswrapper[4860]: W0320 10:54:53.226751 4860 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:54:53Z is after 2026-02-23T05:33:13Z Mar 20 10:54:53 crc kubenswrapper[4860]: E0320 
10:54:53.226853 4860 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:54:53Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 10:54:53 crc kubenswrapper[4860]: I0320 10:54:53.229065 4860 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 20 10:54:53 crc kubenswrapper[4860]: I0320 10:54:53.229152 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 20 10:54:53 crc kubenswrapper[4860]: I0320 10:54:53.264510 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:54:53Z is after 2026-02-23T05:33:13Z Mar 20 10:54:53 crc kubenswrapper[4860]: I0320 10:54:53.531693 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 20 10:54:53 crc kubenswrapper[4860]: I0320 10:54:53.534193 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"79b2c88e091df7e4cc5a494a7746ce70e88e04622191ee0280af1de52f4a5ddd"} Mar 20 10:54:53 crc kubenswrapper[4860]: I0320 10:54:53.534417 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:53 crc kubenswrapper[4860]: I0320 10:54:53.535281 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:53 crc kubenswrapper[4860]: I0320 10:54:53.535321 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:53 crc kubenswrapper[4860]: I0320 10:54:53.535333 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:54 crc kubenswrapper[4860]: I0320 10:54:54.272878 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:54:54Z is after 2026-02-23T05:33:13Z Mar 20 10:54:54 crc kubenswrapper[4860]: I0320 10:54:54.373632 4860 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 20 10:54:54 crc kubenswrapper[4860]: [+]log ok Mar 20 10:54:54 crc kubenswrapper[4860]: [+]etcd ok Mar 20 10:54:54 crc kubenswrapper[4860]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 20 10:54:54 crc kubenswrapper[4860]: [+]poststarthook/openshift.io-api-request-count-filter ok Mar 20 10:54:54 crc kubenswrapper[4860]: [+]poststarthook/openshift.io-startkubeinformers ok Mar 20 10:54:54 crc kubenswrapper[4860]: 
[+]poststarthook/openshift.io-openshift-apiserver-reachable ok Mar 20 10:54:54 crc kubenswrapper[4860]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Mar 20 10:54:54 crc kubenswrapper[4860]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 20 10:54:54 crc kubenswrapper[4860]: [+]poststarthook/generic-apiserver-start-informers ok Mar 20 10:54:54 crc kubenswrapper[4860]: [+]poststarthook/priority-and-fairness-config-consumer ok Mar 20 10:54:54 crc kubenswrapper[4860]: [+]poststarthook/priority-and-fairness-filter ok Mar 20 10:54:54 crc kubenswrapper[4860]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 20 10:54:54 crc kubenswrapper[4860]: [+]poststarthook/start-apiextensions-informers ok Mar 20 10:54:54 crc kubenswrapper[4860]: [+]poststarthook/start-apiextensions-controllers ok Mar 20 10:54:54 crc kubenswrapper[4860]: [+]poststarthook/crd-informer-synced ok Mar 20 10:54:54 crc kubenswrapper[4860]: [+]poststarthook/start-system-namespaces-controller ok Mar 20 10:54:54 crc kubenswrapper[4860]: [+]poststarthook/start-cluster-authentication-info-controller ok Mar 20 10:54:54 crc kubenswrapper[4860]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Mar 20 10:54:54 crc kubenswrapper[4860]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Mar 20 10:54:54 crc kubenswrapper[4860]: [+]poststarthook/start-legacy-token-tracking-controller ok Mar 20 10:54:54 crc kubenswrapper[4860]: [+]poststarthook/start-service-ip-repair-controllers ok Mar 20 10:54:54 crc kubenswrapper[4860]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Mar 20 10:54:54 crc kubenswrapper[4860]: [+]poststarthook/scheduling/bootstrap-system-priority-classes ok Mar 20 10:54:54 crc kubenswrapper[4860]: [+]poststarthook/priority-and-fairness-config-producer ok Mar 20 10:54:54 crc kubenswrapper[4860]: [+]poststarthook/bootstrap-controller ok Mar 20 10:54:54 crc kubenswrapper[4860]: 
[+]poststarthook/aggregator-reload-proxy-client-cert ok Mar 20 10:54:54 crc kubenswrapper[4860]: [+]poststarthook/start-kube-aggregator-informers ok Mar 20 10:54:54 crc kubenswrapper[4860]: [+]poststarthook/apiservice-status-local-available-controller ok Mar 20 10:54:54 crc kubenswrapper[4860]: [+]poststarthook/apiservice-status-remote-available-controller ok Mar 20 10:54:54 crc kubenswrapper[4860]: [+]poststarthook/apiservice-registration-controller ok Mar 20 10:54:54 crc kubenswrapper[4860]: [+]poststarthook/apiservice-wait-for-first-sync ok Mar 20 10:54:54 crc kubenswrapper[4860]: [+]poststarthook/apiservice-discovery-controller ok Mar 20 10:54:54 crc kubenswrapper[4860]: [+]poststarthook/kube-apiserver-autoregistration ok Mar 20 10:54:54 crc kubenswrapper[4860]: [+]autoregister-completion ok Mar 20 10:54:54 crc kubenswrapper[4860]: [+]poststarthook/apiservice-openapi-controller ok Mar 20 10:54:54 crc kubenswrapper[4860]: [+]poststarthook/apiservice-openapiv3-controller ok Mar 20 10:54:54 crc kubenswrapper[4860]: livez check failed Mar 20 10:54:54 crc kubenswrapper[4860]: I0320 10:54:54.373707 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 10:54:54 crc kubenswrapper[4860]: I0320 10:54:54.539102 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 20 10:54:54 crc kubenswrapper[4860]: I0320 10:54:54.539543 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 20 10:54:54 crc kubenswrapper[4860]: I0320 10:54:54.541345 4860 generic.go:334] "Generic (PLEG): container finished" 
podID="f4b27818a5e8e43d0dc095d08835c792" containerID="79b2c88e091df7e4cc5a494a7746ce70e88e04622191ee0280af1de52f4a5ddd" exitCode=255 Mar 20 10:54:54 crc kubenswrapper[4860]: I0320 10:54:54.541403 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"79b2c88e091df7e4cc5a494a7746ce70e88e04622191ee0280af1de52f4a5ddd"} Mar 20 10:54:54 crc kubenswrapper[4860]: I0320 10:54:54.541447 4860 scope.go:117] "RemoveContainer" containerID="c1e891984b7d45630352f0e37bf11ec8f9f8df49a17c27c868951abcc16e1a89" Mar 20 10:54:54 crc kubenswrapper[4860]: I0320 10:54:54.541702 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:54 crc kubenswrapper[4860]: I0320 10:54:54.542901 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:54 crc kubenswrapper[4860]: I0320 10:54:54.542950 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:54 crc kubenswrapper[4860]: I0320 10:54:54.542969 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:54 crc kubenswrapper[4860]: I0320 10:54:54.544005 4860 scope.go:117] "RemoveContainer" containerID="79b2c88e091df7e4cc5a494a7746ce70e88e04622191ee0280af1de52f4a5ddd" Mar 20 10:54:54 crc kubenswrapper[4860]: E0320 10:54:54.544555 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 10:54:55 crc kubenswrapper[4860]: I0320 
10:54:55.263157 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:54:55Z is after 2026-02-23T05:33:13Z Mar 20 10:54:55 crc kubenswrapper[4860]: I0320 10:54:55.547025 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 20 10:54:55 crc kubenswrapper[4860]: I0320 10:54:55.674721 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 20 10:54:55 crc kubenswrapper[4860]: I0320 10:54:55.675017 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:55 crc kubenswrapper[4860]: I0320 10:54:55.676600 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:55 crc kubenswrapper[4860]: I0320 10:54:55.676639 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:55 crc kubenswrapper[4860]: I0320 10:54:55.676650 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:55 crc kubenswrapper[4860]: I0320 10:54:55.691696 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 20 10:54:56 crc kubenswrapper[4860]: I0320 10:54:56.270499 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:54:56Z is after 
2026-02-23T05:33:13Z Mar 20 10:54:56 crc kubenswrapper[4860]: I0320 10:54:56.555058 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:56 crc kubenswrapper[4860]: I0320 10:54:56.556667 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:56 crc kubenswrapper[4860]: I0320 10:54:56.556723 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:56 crc kubenswrapper[4860]: I0320 10:54:56.556741 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:57 crc kubenswrapper[4860]: I0320 10:54:57.264838 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:54:57Z is after 2026-02-23T05:33:13Z Mar 20 10:54:57 crc kubenswrapper[4860]: E0320 10:54:57.569965 4860 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 10:54:58 crc kubenswrapper[4860]: I0320 10:54:58.263964 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:54:58Z is after 2026-02-23T05:33:13Z Mar 20 10:54:59 crc kubenswrapper[4860]: I0320 10:54:59.263770 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T10:54:59Z is after 2026-02-23T05:33:13Z Mar 20 10:54:59 crc kubenswrapper[4860]: I0320 10:54:59.375579 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:54:59 crc kubenswrapper[4860]: I0320 10:54:59.375858 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:59 crc kubenswrapper[4860]: I0320 10:54:59.377420 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:59 crc kubenswrapper[4860]: I0320 10:54:59.377494 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:59 crc kubenswrapper[4860]: I0320 10:54:59.377507 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:59 crc kubenswrapper[4860]: I0320 10:54:59.378208 4860 scope.go:117] "RemoveContainer" containerID="79b2c88e091df7e4cc5a494a7746ce70e88e04622191ee0280af1de52f4a5ddd" Mar 20 10:54:59 crc kubenswrapper[4860]: E0320 10:54:59.378435 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 10:54:59 crc kubenswrapper[4860]: I0320 10:54:59.381330 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:54:59 crc kubenswrapper[4860]: I0320 10:54:59.562500 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:59 crc kubenswrapper[4860]: 
I0320 10:54:59.563987 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:59 crc kubenswrapper[4860]: I0320 10:54:59.564021 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:59 crc kubenswrapper[4860]: I0320 10:54:59.564029 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:59 crc kubenswrapper[4860]: I0320 10:54:59.564530 4860 scope.go:117] "RemoveContainer" containerID="79b2c88e091df7e4cc5a494a7746ce70e88e04622191ee0280af1de52f4a5ddd" Mar 20 10:54:59 crc kubenswrapper[4860]: E0320 10:54:59.564692 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 10:54:59 crc kubenswrapper[4860]: I0320 10:54:59.616748 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:59 crc kubenswrapper[4860]: I0320 10:54:59.618491 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:59 crc kubenswrapper[4860]: I0320 10:54:59.618525 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:59 crc kubenswrapper[4860]: I0320 10:54:59.618535 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:59 crc kubenswrapper[4860]: I0320 10:54:59.618562 4860 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 10:54:59 crc kubenswrapper[4860]: E0320 
10:54:59.621405 4860 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:54:59Z is after 2026-02-23T05:33:13Z" node="crc" Mar 20 10:54:59 crc kubenswrapper[4860]: E0320 10:54:59.625686 4860 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:54:59Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 20 10:54:59 crc kubenswrapper[4860]: I0320 10:54:59.902280 4860 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 10:54:59 crc kubenswrapper[4860]: I0320 10:54:59.902389 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 10:55:00 crc kubenswrapper[4860]: W0320 10:55:00.259153 4860 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:00Z is after 
2026-02-23T05:33:13Z Mar 20 10:55:00 crc kubenswrapper[4860]: E0320 10:55:00.259282 4860 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:00Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 10:55:00 crc kubenswrapper[4860]: I0320 10:55:00.262191 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:00Z is after 2026-02-23T05:33:13Z Mar 20 10:55:01 crc kubenswrapper[4860]: I0320 10:55:01.077588 4860 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:55:01 crc kubenswrapper[4860]: I0320 10:55:01.077812 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:01 crc kubenswrapper[4860]: I0320 10:55:01.079349 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:01 crc kubenswrapper[4860]: I0320 10:55:01.079409 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:01 crc kubenswrapper[4860]: I0320 10:55:01.079429 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:01 crc kubenswrapper[4860]: I0320 10:55:01.080424 4860 scope.go:117] "RemoveContainer" containerID="79b2c88e091df7e4cc5a494a7746ce70e88e04622191ee0280af1de52f4a5ddd" Mar 20 10:55:01 crc 
kubenswrapper[4860]: E0320 10:55:01.080712 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 10:55:01 crc kubenswrapper[4860]: W0320 10:55:01.179308 4860 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:01Z is after 2026-02-23T05:33:13Z Mar 20 10:55:01 crc kubenswrapper[4860]: E0320 10:55:01.179422 4860 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:01Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 10:55:01 crc kubenswrapper[4860]: I0320 10:55:01.262959 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:01Z is after 2026-02-23T05:33:13Z Mar 20 10:55:01 crc kubenswrapper[4860]: I0320 10:55:01.566340 4860 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 10:55:01 crc kubenswrapper[4860]: E0320 10:55:01.570171 4860 
certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:01Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 10:55:01 crc kubenswrapper[4860]: W0320 10:55:01.576694 4860 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:01Z is after 2026-02-23T05:33:13Z Mar 20 10:55:01 crc kubenswrapper[4860]: E0320 10:55:01.576829 4860 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:01Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 10:55:02 crc kubenswrapper[4860]: I0320 10:55:02.263430 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:02Z is after 2026-02-23T05:33:13Z Mar 20 10:55:02 crc kubenswrapper[4860]: W0320 10:55:02.637475 4860 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get 
"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:02Z is after 2026-02-23T05:33:13Z Mar 20 10:55:02 crc kubenswrapper[4860]: E0320 10:55:02.637890 4860 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:02Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 10:55:02 crc kubenswrapper[4860]: I0320 10:55:02.639665 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:55:02 crc kubenswrapper[4860]: I0320 10:55:02.639894 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:02 crc kubenswrapper[4860]: I0320 10:55:02.641254 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:02 crc kubenswrapper[4860]: I0320 10:55:02.641352 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:02 crc kubenswrapper[4860]: I0320 10:55:02.641373 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:02 crc kubenswrapper[4860]: I0320 10:55:02.642395 4860 scope.go:117] "RemoveContainer" containerID="79b2c88e091df7e4cc5a494a7746ce70e88e04622191ee0280af1de52f4a5ddd" Mar 20 10:55:02 crc kubenswrapper[4860]: E0320 10:55:02.642699 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 10:55:03 crc kubenswrapper[4860]: E0320 10:55:03.223737 4860 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:03Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e874f2723de82 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:37.259382402 +0000 UTC m=+1.480743300,LastTimestamp:2026-03-20 10:54:37.259382402 +0000 UTC m=+1.480743300,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:03 crc kubenswrapper[4860]: I0320 10:55:03.262993 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:03Z is after 2026-02-23T05:33:13Z Mar 20 10:55:04 crc kubenswrapper[4860]: I0320 10:55:04.263413 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-03-20T10:55:04Z is after 2026-02-23T05:33:13Z Mar 20 10:55:05 crc kubenswrapper[4860]: I0320 10:55:05.264914 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:05Z is after 2026-02-23T05:33:13Z Mar 20 10:55:06 crc kubenswrapper[4860]: I0320 10:55:06.265253 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:06Z is after 2026-02-23T05:33:13Z Mar 20 10:55:06 crc kubenswrapper[4860]: I0320 10:55:06.621519 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:06 crc kubenswrapper[4860]: I0320 10:55:06.623115 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:06 crc kubenswrapper[4860]: I0320 10:55:06.623180 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:06 crc kubenswrapper[4860]: I0320 10:55:06.623193 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:06 crc kubenswrapper[4860]: I0320 10:55:06.623307 4860 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 10:55:06 crc kubenswrapper[4860]: E0320 10:55:06.627098 4860 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:06Z is 
after 2026-02-23T05:33:13Z" node="crc" Mar 20 10:55:06 crc kubenswrapper[4860]: E0320 10:55:06.629689 4860 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:06Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 20 10:55:07 crc kubenswrapper[4860]: I0320 10:55:07.265294 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:07Z is after 2026-02-23T05:33:13Z Mar 20 10:55:07 crc kubenswrapper[4860]: E0320 10:55:07.570097 4860 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 10:55:08 crc kubenswrapper[4860]: I0320 10:55:08.265220 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:08Z is after 2026-02-23T05:33:13Z Mar 20 10:55:09 crc kubenswrapper[4860]: I0320 10:55:09.266783 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:09Z is after 2026-02-23T05:33:13Z Mar 20 10:55:09 crc kubenswrapper[4860]: I0320 10:55:09.902609 4860 patch_prober.go:28] interesting pod/kube-controller-manager-crc 
container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 10:55:09 crc kubenswrapper[4860]: I0320 10:55:09.902756 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 10:55:09 crc kubenswrapper[4860]: I0320 10:55:09.902874 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:55:09 crc kubenswrapper[4860]: I0320 10:55:09.903140 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:09 crc kubenswrapper[4860]: I0320 10:55:09.904901 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:09 crc kubenswrapper[4860]: I0320 10:55:09.904971 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:09 crc kubenswrapper[4860]: I0320 10:55:09.904985 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:09 crc kubenswrapper[4860]: I0320 10:55:09.905622 4860 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"37468ccfc2574c3aefd9c88a616ce0f0c69cfa3a56b41c421d53461d4f8b2cc2"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container 
cluster-policy-controller failed startup probe, will be restarted" Mar 20 10:55:09 crc kubenswrapper[4860]: I0320 10:55:09.905809 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://37468ccfc2574c3aefd9c88a616ce0f0c69cfa3a56b41c421d53461d4f8b2cc2" gracePeriod=30 Mar 20 10:55:10 crc kubenswrapper[4860]: I0320 10:55:10.266501 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:10Z is after 2026-02-23T05:33:13Z Mar 20 10:55:10 crc kubenswrapper[4860]: I0320 10:55:10.595510 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 20 10:55:10 crc kubenswrapper[4860]: I0320 10:55:10.596039 4860 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="37468ccfc2574c3aefd9c88a616ce0f0c69cfa3a56b41c421d53461d4f8b2cc2" exitCode=255 Mar 20 10:55:10 crc kubenswrapper[4860]: I0320 10:55:10.596109 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"37468ccfc2574c3aefd9c88a616ce0f0c69cfa3a56b41c421d53461d4f8b2cc2"} Mar 20 10:55:10 crc kubenswrapper[4860]: I0320 10:55:10.596166 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"82f4a0bcf048582263f9ba78f91872aba9de0ba2e3cce65a31e587c04a849fbc"} Mar 20 10:55:10 crc kubenswrapper[4860]: I0320 10:55:10.596292 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:10 crc kubenswrapper[4860]: I0320 10:55:10.597638 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:10 crc kubenswrapper[4860]: I0320 10:55:10.597690 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:10 crc kubenswrapper[4860]: I0320 10:55:10.597704 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:11 crc kubenswrapper[4860]: I0320 10:55:11.263986 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:11Z is after 2026-02-23T05:33:13Z Mar 20 10:55:12 crc kubenswrapper[4860]: I0320 10:55:12.263804 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:12Z is after 2026-02-23T05:33:13Z Mar 20 10:55:13 crc kubenswrapper[4860]: E0320 10:55:13.228945 4860 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:13Z is after 2026-02-23T05:33:13Z" 
event="&Event{ObjectMeta:{crc.189e874f2723de82 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:37.259382402 +0000 UTC m=+1.480743300,LastTimestamp:2026-03-20 10:54:37.259382402 +0000 UTC m=+1.480743300,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:13 crc kubenswrapper[4860]: I0320 10:55:13.265996 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:13Z is after 2026-02-23T05:33:13Z Mar 20 10:55:13 crc kubenswrapper[4860]: I0320 10:55:13.627965 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:13 crc kubenswrapper[4860]: I0320 10:55:13.629762 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:13 crc kubenswrapper[4860]: I0320 10:55:13.629826 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:13 crc kubenswrapper[4860]: I0320 10:55:13.629846 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:13 crc kubenswrapper[4860]: I0320 10:55:13.629882 4860 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 10:55:13 crc kubenswrapper[4860]: E0320 10:55:13.632601 4860 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:13Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 20 10:55:13 crc kubenswrapper[4860]: E0320 10:55:13.632992 4860 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:13Z is after 2026-02-23T05:33:13Z" node="crc" Mar 20 10:55:14 crc kubenswrapper[4860]: I0320 10:55:14.262629 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:14Z is after 2026-02-23T05:33:13Z Mar 20 10:55:15 crc kubenswrapper[4860]: I0320 10:55:15.263752 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:15Z is after 2026-02-23T05:33:13Z Mar 20 10:55:16 crc kubenswrapper[4860]: I0320 10:55:16.262993 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:16Z is after 2026-02-23T05:33:13Z Mar 20 10:55:16 crc kubenswrapper[4860]: I0320 10:55:16.901118 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:55:16 crc kubenswrapper[4860]: I0320 10:55:16.901320 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:16 crc kubenswrapper[4860]: I0320 10:55:16.905455 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:16 crc kubenswrapper[4860]: I0320 10:55:16.905501 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:16 crc kubenswrapper[4860]: I0320 10:55:16.905515 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:17 crc kubenswrapper[4860]: I0320 10:55:17.266055 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:17Z is after 2026-02-23T05:33:13Z Mar 20 10:55:17 crc kubenswrapper[4860]: W0320 10:55:17.269347 4860 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:17Z is after 2026-02-23T05:33:13Z Mar 20 10:55:17 crc kubenswrapper[4860]: E0320 10:55:17.269436 4860 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-03-20T10:55:17Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 10:55:17 crc kubenswrapper[4860]: W0320 10:55:17.305614 4860 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:17Z is after 2026-02-23T05:33:13Z Mar 20 10:55:17 crc kubenswrapper[4860]: E0320 10:55:17.305715 4860 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:17Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 10:55:17 crc kubenswrapper[4860]: I0320 10:55:17.368558 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:55:17 crc kubenswrapper[4860]: I0320 10:55:17.413681 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:17 crc kubenswrapper[4860]: I0320 10:55:17.416116 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:17 crc kubenswrapper[4860]: I0320 10:55:17.416365 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:17 crc kubenswrapper[4860]: I0320 10:55:17.416479 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:17 crc kubenswrapper[4860]: I0320 10:55:17.417117 4860 scope.go:117] "RemoveContainer" 
containerID="79b2c88e091df7e4cc5a494a7746ce70e88e04622191ee0280af1de52f4a5ddd" Mar 20 10:55:17 crc kubenswrapper[4860]: E0320 10:55:17.570369 4860 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 10:55:17 crc kubenswrapper[4860]: I0320 10:55:17.620312 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 20 10:55:17 crc kubenswrapper[4860]: I0320 10:55:17.621886 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"cf65a1d8049a22874f20125f441990ba9f9e338d0f8b206706a27079a88b069e"} Mar 20 10:55:17 crc kubenswrapper[4860]: I0320 10:55:17.621962 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:17 crc kubenswrapper[4860]: I0320 10:55:17.622196 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:17 crc kubenswrapper[4860]: I0320 10:55:17.623832 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:17 crc kubenswrapper[4860]: I0320 10:55:17.623882 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:17 crc kubenswrapper[4860]: I0320 10:55:17.623898 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:17 crc kubenswrapper[4860]: I0320 10:55:17.624664 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:17 crc kubenswrapper[4860]: I0320 10:55:17.624691 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 10:55:17 crc kubenswrapper[4860]: I0320 10:55:17.624703 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:17 crc kubenswrapper[4860]: W0320 10:55:17.641554 4860 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:17Z is after 2026-02-23T05:33:13Z Mar 20 10:55:17 crc kubenswrapper[4860]: E0320 10:55:17.641666 4860 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:17Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 10:55:18 crc kubenswrapper[4860]: I0320 10:55:18.264881 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:18Z is after 2026-02-23T05:33:13Z Mar 20 10:55:18 crc kubenswrapper[4860]: I0320 10:55:18.431422 4860 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 10:55:18 crc kubenswrapper[4860]: E0320 10:55:18.436085 4860 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post 
\"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:18Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 10:55:18 crc kubenswrapper[4860]: E0320 10:55:18.437384 4860 certificate_manager.go:440] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Reached backoff limit, still unable to rotate certs: timed out waiting for the condition" logger="UnhandledError" Mar 20 10:55:18 crc kubenswrapper[4860]: I0320 10:55:18.626712 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 10:55:18 crc kubenswrapper[4860]: I0320 10:55:18.627470 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 20 10:55:18 crc kubenswrapper[4860]: I0320 10:55:18.630500 4860 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="cf65a1d8049a22874f20125f441990ba9f9e338d0f8b206706a27079a88b069e" exitCode=255 Mar 20 10:55:18 crc kubenswrapper[4860]: I0320 10:55:18.630567 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"cf65a1d8049a22874f20125f441990ba9f9e338d0f8b206706a27079a88b069e"} Mar 20 10:55:18 crc kubenswrapper[4860]: I0320 10:55:18.630632 4860 scope.go:117] "RemoveContainer" containerID="79b2c88e091df7e4cc5a494a7746ce70e88e04622191ee0280af1de52f4a5ddd" Mar 20 10:55:18 crc kubenswrapper[4860]: I0320 10:55:18.630840 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:18 crc kubenswrapper[4860]: I0320 
10:55:18.632034 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:18 crc kubenswrapper[4860]: I0320 10:55:18.632078 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:18 crc kubenswrapper[4860]: I0320 10:55:18.632092 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:18 crc kubenswrapper[4860]: I0320 10:55:18.632904 4860 scope.go:117] "RemoveContainer" containerID="cf65a1d8049a22874f20125f441990ba9f9e338d0f8b206706a27079a88b069e" Mar 20 10:55:18 crc kubenswrapper[4860]: E0320 10:55:18.633143 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 10:55:19 crc kubenswrapper[4860]: I0320 10:55:19.264631 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:19Z is after 2026-02-23T05:33:13Z Mar 20 10:55:19 crc kubenswrapper[4860]: I0320 10:55:19.636267 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 10:55:19 crc kubenswrapper[4860]: I0320 10:55:19.902030 4860 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure 
output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 10:55:19 crc kubenswrapper[4860]: I0320 10:55:19.902129 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 10:55:20 crc kubenswrapper[4860]: I0320 10:55:20.264928 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:20Z is after 2026-02-23T05:33:13Z Mar 20 10:55:20 crc kubenswrapper[4860]: I0320 10:55:20.633395 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:20 crc kubenswrapper[4860]: I0320 10:55:20.635219 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:20 crc kubenswrapper[4860]: I0320 10:55:20.635288 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:20 crc kubenswrapper[4860]: I0320 10:55:20.635301 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:20 crc kubenswrapper[4860]: I0320 10:55:20.635335 4860 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 10:55:20 crc kubenswrapper[4860]: E0320 10:55:20.636426 4860 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:20Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 20 10:55:20 crc kubenswrapper[4860]: E0320 10:55:20.637937 4860 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:20Z is after 2026-02-23T05:33:13Z" node="crc" Mar 20 10:55:21 crc kubenswrapper[4860]: I0320 10:55:21.077628 4860 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:55:21 crc kubenswrapper[4860]: I0320 10:55:21.077823 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:21 crc kubenswrapper[4860]: I0320 10:55:21.079117 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:21 crc kubenswrapper[4860]: I0320 10:55:21.079150 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:21 crc kubenswrapper[4860]: I0320 10:55:21.079161 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:21 crc kubenswrapper[4860]: I0320 10:55:21.079841 4860 scope.go:117] "RemoveContainer" containerID="cf65a1d8049a22874f20125f441990ba9f9e338d0f8b206706a27079a88b069e" Mar 20 10:55:21 crc kubenswrapper[4860]: E0320 10:55:21.080050 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 10:55:21 crc kubenswrapper[4860]: I0320 10:55:21.263464 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:21Z is after 2026-02-23T05:33:13Z Mar 20 10:55:22 crc kubenswrapper[4860]: I0320 10:55:22.262693 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:22Z is after 2026-02-23T05:33:13Z Mar 20 10:55:22 crc kubenswrapper[4860]: I0320 10:55:22.639700 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:55:22 crc kubenswrapper[4860]: I0320 10:55:22.639949 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:22 crc kubenswrapper[4860]: I0320 10:55:22.641605 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:22 crc kubenswrapper[4860]: I0320 10:55:22.641638 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:22 crc kubenswrapper[4860]: I0320 10:55:22.641651 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:22 crc kubenswrapper[4860]: I0320 10:55:22.642405 4860 scope.go:117] "RemoveContainer" 
containerID="cf65a1d8049a22874f20125f441990ba9f9e338d0f8b206706a27079a88b069e" Mar 20 10:55:22 crc kubenswrapper[4860]: E0320 10:55:22.642621 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 10:55:23 crc kubenswrapper[4860]: E0320 10:55:23.232818 4860 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:23Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e874f2723de82 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:37.259382402 +0000 UTC m=+1.480743300,LastTimestamp:2026-03-20 10:54:37.259382402 +0000 UTC m=+1.480743300,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:23 crc kubenswrapper[4860]: I0320 10:55:23.263131 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:23Z is after 2026-02-23T05:33:13Z Mar 20 10:55:24 crc kubenswrapper[4860]: I0320 10:55:24.265540 4860 csi_plugin.go:884] Failed to 
contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:24Z is after 2026-02-23T05:33:13Z Mar 20 10:55:25 crc kubenswrapper[4860]: I0320 10:55:25.262874 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:25Z is after 2026-02-23T05:33:13Z Mar 20 10:55:26 crc kubenswrapper[4860]: I0320 10:55:26.263441 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:26Z is after 2026-02-23T05:33:13Z Mar 20 10:55:26 crc kubenswrapper[4860]: W0320 10:55:26.695990 4860 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:26Z is after 2026-02-23T05:33:13Z Mar 20 10:55:26 crc kubenswrapper[4860]: E0320 10:55:26.696090 4860 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:26Z is after 2026-02-23T05:33:13Z" 
logger="UnhandledError" Mar 20 10:55:27 crc kubenswrapper[4860]: I0320 10:55:27.264922 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:27Z is after 2026-02-23T05:33:13Z Mar 20 10:55:27 crc kubenswrapper[4860]: E0320 10:55:27.570609 4860 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 10:55:27 crc kubenswrapper[4860]: I0320 10:55:27.638426 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:27 crc kubenswrapper[4860]: I0320 10:55:27.640004 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:27 crc kubenswrapper[4860]: I0320 10:55:27.640051 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:27 crc kubenswrapper[4860]: I0320 10:55:27.640065 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:27 crc kubenswrapper[4860]: I0320 10:55:27.640096 4860 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 10:55:27 crc kubenswrapper[4860]: E0320 10:55:27.643491 4860 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:27Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 20 10:55:27 crc kubenswrapper[4860]: E0320 10:55:27.643743 4860 kubelet_node_status.go:99] "Unable to register node with API server" 
err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:27Z is after 2026-02-23T05:33:13Z" node="crc" Mar 20 10:55:28 crc kubenswrapper[4860]: I0320 10:55:28.264430 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:28Z is after 2026-02-23T05:33:13Z Mar 20 10:55:29 crc kubenswrapper[4860]: I0320 10:55:29.264727 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:29Z is after 2026-02-23T05:33:13Z Mar 20 10:55:29 crc kubenswrapper[4860]: I0320 10:55:29.902261 4860 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 10:55:29 crc kubenswrapper[4860]: I0320 10:55:29.902338 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 10:55:30 crc kubenswrapper[4860]: I0320 10:55:30.264866 4860 csi_plugin.go:884] Failed to contact API 
server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:30Z is after 2026-02-23T05:33:13Z Mar 20 10:55:31 crc kubenswrapper[4860]: I0320 10:55:31.263205 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:31Z is after 2026-02-23T05:33:13Z Mar 20 10:55:32 crc kubenswrapper[4860]: I0320 10:55:32.262524 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:32Z is after 2026-02-23T05:33:13Z Mar 20 10:55:33 crc kubenswrapper[4860]: E0320 10:55:33.237408 4860 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:33Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e874f2723de82 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:37.259382402 +0000 UTC m=+1.480743300,LastTimestamp:2026-03-20 10:54:37.259382402 +0000 UTC m=+1.480743300,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:33 crc kubenswrapper[4860]: I0320 10:55:33.262960 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:33Z is after 2026-02-23T05:33:13Z Mar 20 10:55:34 crc kubenswrapper[4860]: I0320 10:55:34.083003 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 10:55:34 crc kubenswrapper[4860]: I0320 10:55:34.083290 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:34 crc kubenswrapper[4860]: I0320 10:55:34.084609 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:34 crc kubenswrapper[4860]: I0320 10:55:34.084682 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:34 crc kubenswrapper[4860]: I0320 10:55:34.084700 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:34 crc kubenswrapper[4860]: I0320 10:55:34.262947 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:34Z is after 2026-02-23T05:33:13Z Mar 20 10:55:34 crc kubenswrapper[4860]: I0320 10:55:34.644128 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:34 crc kubenswrapper[4860]: I0320 10:55:34.645987 4860 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:34 crc kubenswrapper[4860]: I0320 10:55:34.646089 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:34 crc kubenswrapper[4860]: I0320 10:55:34.646110 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:34 crc kubenswrapper[4860]: I0320 10:55:34.646160 4860 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 10:55:34 crc kubenswrapper[4860]: E0320 10:55:34.649758 4860 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:34Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 20 10:55:34 crc kubenswrapper[4860]: E0320 10:55:34.651285 4860 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:34Z is after 2026-02-23T05:33:13Z" node="crc" Mar 20 10:55:35 crc kubenswrapper[4860]: I0320 10:55:35.263390 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:35Z is after 2026-02-23T05:33:13Z Mar 20 10:55:36 crc kubenswrapper[4860]: I0320 10:55:36.264876 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:36Z is after 2026-02-23T05:33:13Z Mar 20 10:55:37 crc kubenswrapper[4860]: I0320 10:55:37.263269 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:37Z is after 2026-02-23T05:33:13Z Mar 20 10:55:37 crc kubenswrapper[4860]: I0320 10:55:37.413268 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:37 crc kubenswrapper[4860]: I0320 10:55:37.418663 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:37 crc kubenswrapper[4860]: I0320 10:55:37.418707 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:37 crc kubenswrapper[4860]: I0320 10:55:37.418719 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:37 crc kubenswrapper[4860]: I0320 10:55:37.419477 4860 scope.go:117] "RemoveContainer" containerID="cf65a1d8049a22874f20125f441990ba9f9e338d0f8b206706a27079a88b069e" Mar 20 10:55:37 crc kubenswrapper[4860]: E0320 10:55:37.419679 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 10:55:37 crc 
kubenswrapper[4860]: E0320 10:55:37.571037 4860 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 10:55:38 crc kubenswrapper[4860]: I0320 10:55:38.268695 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:55:39 crc kubenswrapper[4860]: I0320 10:55:39.264815 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:55:39 crc kubenswrapper[4860]: I0320 10:55:39.902697 4860 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 10:55:39 crc kubenswrapper[4860]: I0320 10:55:39.903160 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 10:55:39 crc kubenswrapper[4860]: I0320 10:55:39.903297 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:55:39 crc kubenswrapper[4860]: I0320 10:55:39.903478 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume 
controller attach/detach" Mar 20 10:55:39 crc kubenswrapper[4860]: I0320 10:55:39.905041 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:39 crc kubenswrapper[4860]: I0320 10:55:39.905462 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:39 crc kubenswrapper[4860]: I0320 10:55:39.905637 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:39 crc kubenswrapper[4860]: I0320 10:55:39.906525 4860 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"82f4a0bcf048582263f9ba78f91872aba9de0ba2e3cce65a31e587c04a849fbc"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 20 10:55:39 crc kubenswrapper[4860]: I0320 10:55:39.906841 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://82f4a0bcf048582263f9ba78f91872aba9de0ba2e3cce65a31e587c04a849fbc" gracePeriod=30 Mar 20 10:55:40 crc kubenswrapper[4860]: I0320 10:55:40.264102 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:55:40 crc kubenswrapper[4860]: I0320 10:55:40.699091 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 20 10:55:40 crc kubenswrapper[4860]: I0320 10:55:40.701030 
4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 20 10:55:40 crc kubenswrapper[4860]: I0320 10:55:40.701495 4860 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="82f4a0bcf048582263f9ba78f91872aba9de0ba2e3cce65a31e587c04a849fbc" exitCode=255 Mar 20 10:55:40 crc kubenswrapper[4860]: I0320 10:55:40.701636 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"82f4a0bcf048582263f9ba78f91872aba9de0ba2e3cce65a31e587c04a849fbc"} Mar 20 10:55:40 crc kubenswrapper[4860]: I0320 10:55:40.701739 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"cf2624c09c9ed0c88340cb5c33a9f304b84e7f10b768178bc03980768edd770b"} Mar 20 10:55:40 crc kubenswrapper[4860]: I0320 10:55:40.701832 4860 scope.go:117] "RemoveContainer" containerID="37468ccfc2574c3aefd9c88a616ce0f0c69cfa3a56b41c421d53461d4f8b2cc2" Mar 20 10:55:40 crc kubenswrapper[4860]: I0320 10:55:40.702075 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:40 crc kubenswrapper[4860]: I0320 10:55:40.703387 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:40 crc kubenswrapper[4860]: I0320 10:55:40.703522 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:40 crc kubenswrapper[4860]: I0320 10:55:40.703608 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:41 crc 
kubenswrapper[4860]: I0320 10:55:41.267736 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:55:41 crc kubenswrapper[4860]: I0320 10:55:41.652263 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:41 crc kubenswrapper[4860]: I0320 10:55:41.654216 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:41 crc kubenswrapper[4860]: I0320 10:55:41.654347 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:41 crc kubenswrapper[4860]: I0320 10:55:41.654364 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:41 crc kubenswrapper[4860]: I0320 10:55:41.654402 4860 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 10:55:41 crc kubenswrapper[4860]: E0320 10:55:41.658054 4860 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 10:55:41 crc kubenswrapper[4860]: E0320 10:55:41.658175 4860 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 10:55:41 crc kubenswrapper[4860]: I0320 10:55:41.706862 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 20 10:55:42 crc kubenswrapper[4860]: 
I0320 10:55:42.265714 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.244827 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e874f2723de82 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:37.259382402 +0000 UTC m=+1.480743300,LastTimestamp:2026-03-20 10:54:37.259382402 +0000 UTC m=+1.480743300,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.248929 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e874f2c7541d0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:37.34860232 +0000 UTC m=+1.569963238,LastTimestamp:2026-03-20 10:54:37.34860232 +0000 UTC m=+1.569963238,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.254626 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e874f2c75c917 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:37.348636951 +0000 UTC m=+1.569997869,LastTimestamp:2026-03-20 10:54:37.348636951 +0000 UTC m=+1.569997869,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.259140 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e874f2c76057c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:37.348652412 +0000 UTC m=+1.570013320,LastTimestamp:2026-03-20 10:54:37.348652412 +0000 UTC m=+1.570013320,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.263939 4860 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e874f39095a90 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:37.559634576 +0000 UTC m=+1.780995504,LastTimestamp:2026-03-20 10:54:37.559634576 +0000 UTC m=+1.780995504,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: I0320 10:55:43.264076 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.265703 4860 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e874f2c7541d0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e874f2c7541d0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:37.34860232 +0000 UTC m=+1.569963238,LastTimestamp:2026-03-20 10:54:37.661779981 +0000 UTC m=+1.883140879,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 
10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.270844 4860 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e874f2c75c917\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e874f2c75c917 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:37.348636951 +0000 UTC m=+1.569997869,LastTimestamp:2026-03-20 10:54:37.661804402 +0000 UTC m=+1.883165300,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.275770 4860 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e874f2c76057c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e874f2c76057c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:37.348652412 +0000 UTC m=+1.570013320,LastTimestamp:2026-03-20 10:54:37.661815242 +0000 UTC m=+1.883176140,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.282658 4860 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e874f2c7541d0\" is forbidden: 
User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e874f2c7541d0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:37.34860232 +0000 UTC m=+1.569963238,LastTimestamp:2026-03-20 10:54:37.714601264 +0000 UTC m=+1.935962172,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.288347 4860 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e874f2c75c917\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e874f2c75c917 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:37.348636951 +0000 UTC m=+1.569997869,LastTimestamp:2026-03-20 10:54:37.714630825 +0000 UTC m=+1.935991713,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.291427 4860 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e874f2c76057c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e874f2c76057c default 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:37.348652412 +0000 UTC m=+1.570013320,LastTimestamp:2026-03-20 10:54:37.714640035 +0000 UTC m=+1.936000933,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.295963 4860 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e874f2c7541d0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e874f2c7541d0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:37.34860232 +0000 UTC m=+1.569963238,LastTimestamp:2026-03-20 10:54:37.716196341 +0000 UTC m=+1.937557239,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.300335 4860 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e874f2c75c917\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e874f2c75c917 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:37.348636951 +0000 UTC m=+1.569997869,LastTimestamp:2026-03-20 10:54:37.716212851 +0000 UTC m=+1.937573749,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.304811 4860 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e874f2c76057c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e874f2c76057c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:37.348652412 +0000 UTC m=+1.570013320,LastTimestamp:2026-03-20 10:54:37.716237651 +0000 UTC m=+1.937598549,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.310193 4860 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e874f2c7541d0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e874f2c7541d0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status 
is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:37.34860232 +0000 UTC m=+1.569963238,LastTimestamp:2026-03-20 10:54:37.717176103 +0000 UTC m=+1.938536991,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.314788 4860 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e874f2c75c917\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e874f2c75c917 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:37.348636951 +0000 UTC m=+1.569997869,LastTimestamp:2026-03-20 10:54:37.717197144 +0000 UTC m=+1.938558042,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.318999 4860 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e874f2c76057c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e874f2c76057c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:37.348652412 +0000 UTC 
m=+1.570013320,LastTimestamp:2026-03-20 10:54:37.717206964 +0000 UTC m=+1.938567862,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.323880 4860 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e874f2c7541d0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e874f2c7541d0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:37.34860232 +0000 UTC m=+1.569963238,LastTimestamp:2026-03-20 10:54:37.717638014 +0000 UTC m=+1.938998912,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.329398 4860 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e874f2c75c917\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e874f2c75c917 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:37.348636951 +0000 UTC m=+1.569997869,LastTimestamp:2026-03-20 10:54:37.717660774 +0000 UTC m=+1.939021672,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.334099 4860 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e874f2c76057c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e874f2c76057c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:37.348652412 +0000 UTC m=+1.570013320,LastTimestamp:2026-03-20 10:54:37.717669384 +0000 UTC m=+1.939030282,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.340165 4860 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e874f2c7541d0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e874f2c7541d0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:37.34860232 +0000 UTC m=+1.569963238,LastTimestamp:2026-03-20 10:54:37.717732256 +0000 UTC m=+1.939093154,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.344915 4860 
event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e874f2c75c917\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e874f2c75c917 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:37.348636951 +0000 UTC m=+1.569997869,LastTimestamp:2026-03-20 10:54:37.717750926 +0000 UTC m=+1.939111824,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.350118 4860 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e874f2c76057c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e874f2c76057c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:37.348652412 +0000 UTC m=+1.570013320,LastTimestamp:2026-03-20 10:54:37.717769097 +0000 UTC m=+1.939129995,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.354118 4860 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e874f2c7541d0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in 
API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e874f2c7541d0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:37.34860232 +0000 UTC m=+1.569963238,LastTimestamp:2026-03-20 10:54:37.718692558 +0000 UTC m=+1.940053456,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.358405 4860 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e874f2c75c917\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e874f2c75c917 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:37.348636951 +0000 UTC m=+1.569997869,LastTimestamp:2026-03-20 10:54:37.718710698 +0000 UTC m=+1.940071596,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.364002 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e874f60ce5e36 openshift-machine-config-operator 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:38.226857526 +0000 UTC m=+2.448218424,LastTimestamp:2026-03-20 10:54:38.226857526 +0000 UTC m=+2.448218424,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.368406 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e874f60cfff40 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:38.226964288 +0000 UTC m=+2.448325216,LastTimestamp:2026-03-20 10:54:38.226964288 +0000 UTC m=+2.448325216,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.371609 4860 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e874f60fc9af8 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:38.229887736 +0000 UTC m=+2.451248634,LastTimestamp:2026-03-20 10:54:38.229887736 +0000 UTC m=+2.451248634,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.375086 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e874f613b23e4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:38.23398602 +0000 UTC m=+2.455346948,LastTimestamp:2026-03-20 10:54:38.23398602 +0000 UTC m=+2.455346948,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.378522 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e874f6148876e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:38.23486347 +0000 UTC m=+2.456224408,LastTimestamp:2026-03-20 10:54:38.23486347 +0000 UTC m=+2.456224408,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.382659 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e874fd63a618b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.196870539 +0000 UTC m=+4.418231447,LastTimestamp:2026-03-20 10:54:40.196870539 +0000 UTC m=+4.418231447,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.386386 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e874fd6a9cfd3 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.204173267 +0000 UTC m=+4.425534215,LastTimestamp:2026-03-20 10:54:40.204173267 +0000 UTC m=+4.425534215,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.389870 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" 
event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e874fd6d960c9 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.207290569 +0000 UTC m=+4.428651477,LastTimestamp:2026-03-20 10:54:40.207290569 +0000 UTC m=+4.428651477,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.394300 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e874fd6f64732 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.209184562 +0000 UTC m=+4.430545480,LastTimestamp:2026-03-20 10:54:40.209184562 +0000 UTC m=+4.430545480,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.397985 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e874fd70c7ca3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.210640035 +0000 UTC m=+4.432000943,LastTimestamp:2026-03-20 10:54:40.210640035 +0000 UTC m=+4.432000943,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.402152 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e874fd7233740 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.2121296 +0000 UTC m=+4.433490508,LastTimestamp:2026-03-20 10:54:40.2121296 +0000 UTC m=+4.433490508,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.405502 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API 
group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e874fd73d7a7d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.213850749 +0000 UTC m=+4.435211657,LastTimestamp:2026-03-20 10:54:40.213850749 +0000 UTC m=+4.435211657,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.408880 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e874fd798b6f6 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.219830006 +0000 UTC m=+4.441190914,LastTimestamp:2026-03-20 10:54:40.219830006 +0000 UTC m=+4.441190914,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.412108 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e874fd7b5e46e openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.22174219 +0000 UTC m=+4.443103098,LastTimestamp:2026-03-20 10:54:40.22174219 +0000 UTC m=+4.443103098,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.415342 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e874fd854404f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.232120399 +0000 UTC m=+4.453481337,LastTimestamp:2026-03-20 10:54:40.232120399 +0000 UTC m=+4.453481337,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.419057 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e874fd88077dd openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.235018205 +0000 UTC m=+4.456379113,LastTimestamp:2026-03-20 10:54:40.235018205 +0000 UTC m=+4.456379113,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.424033 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e874fe4a4f59b openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 
10:54:40.438736283 +0000 UTC m=+4.660097191,LastTimestamp:2026-03-20 10:54:40.438736283 +0000 UTC m=+4.660097191,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.427584 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e874fe4b9be96 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.440098454 +0000 UTC m=+4.661459352,LastTimestamp:2026-03-20 10:54:40.440098454 +0000 UTC m=+4.661459352,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.431288 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e874fe4d9cd2e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.442199342 +0000 UTC m=+4.663560250,LastTimestamp:2026-03-20 10:54:40.442199342 +0000 UTC m=+4.663560250,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.435312 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e874fe4ea8f79 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.443297657 +0000 UTC m=+4.664658595,LastTimestamp:2026-03-20 10:54:40.443297657 +0000 UTC m=+4.664658595,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.440617 4860 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e874fec61e627 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.568559143 +0000 UTC m=+4.789920041,LastTimestamp:2026-03-20 10:54:40.568559143 +0000 UTC m=+4.789920041,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.445214 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e874fed485a3b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.583662139 +0000 UTC m=+4.805023037,LastTimestamp:2026-03-20 10:54:40.583662139 +0000 UTC m=+4.805023037,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 
+0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.450299 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e874fed587aa6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.584719014 +0000 UTC m=+4.806079912,LastTimestamp:2026-03-20 10:54:40.584719014 +0000 UTC m=+4.806079912,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.455592 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e874ff0ef6b36 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container 
kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.644942646 +0000 UTC m=+4.866303544,LastTimestamp:2026-03-20 10:54:40.644942646 +0000 UTC m=+4.866303544,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.460360 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e874ff0ffb71b openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.646010651 +0000 UTC m=+4.867371569,LastTimestamp:2026-03-20 10:54:40.646010651 +0000 UTC m=+4.867371569,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.465114 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e874ff106652f openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.646448431 +0000 UTC m=+4.867809329,LastTimestamp:2026-03-20 10:54:40.646448431 +0000 UTC m=+4.867809329,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.469955 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e874ff1069eff openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.646463231 +0000 UTC m=+4.867824119,LastTimestamp:2026-03-20 10:54:40.646463231 +0000 UTC m=+4.867824119,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.474614 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e874ff1ab3521 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] 
[] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.657249569 +0000 UTC m=+4.878610477,LastTimestamp:2026-03-20 10:54:40.657249569 +0000 UTC m=+4.878610477,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.480302 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e874ff1ba76d8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.658249432 +0000 UTC m=+4.879610340,LastTimestamp:2026-03-20 10:54:40.658249432 +0000 UTC m=+4.879610340,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.485544 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API 
group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e874ff213a284 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.664093316 +0000 UTC m=+4.885454224,LastTimestamp:2026-03-20 10:54:40.664093316 +0000 UTC m=+4.885454224,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.490552 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e874ff24616b6 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.667399862 +0000 UTC m=+4.888760760,LastTimestamp:2026-03-20 10:54:40.667399862 +0000 UTC m=+4.888760760,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 
20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.495177 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e874ff2a3a8f9 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.673532153 +0000 UTC m=+4.894893051,LastTimestamp:2026-03-20 10:54:40.673532153 +0000 UTC m=+4.894893051,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.496749 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e874ff2a3bd21 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.673537313 +0000 UTC m=+4.894898211,LastTimestamp:2026-03-20 10:54:40.673537313 +0000 UTC m=+4.894898211,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.501318 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e874ff83f1d2f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.767606063 +0000 UTC m=+4.988966961,LastTimestamp:2026-03-20 10:54:40.767606063 +0000 UTC m=+4.988966961,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.506830 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e874ff934dcab openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container 
kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.783711403 +0000 UTC m=+5.005072301,LastTimestamp:2026-03-20 10:54:40.783711403 +0000 UTC m=+5.005072301,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.513124 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e874ff94892e0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.785003232 +0000 UTC m=+5.006364130,LastTimestamp:2026-03-20 10:54:40.785003232 +0000 UTC m=+5.006364130,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.518580 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e874ffc11fd8a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 
UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.831757706 +0000 UTC m=+5.053118604,LastTimestamp:2026-03-20 10:54:40.831757706 +0000 UTC m=+5.053118604,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.523487 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e874ffc9b0a72 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.840739442 +0000 UTC m=+5.062100340,LastTimestamp:2026-03-20 10:54:40.840739442 +0000 UTC m=+5.062100340,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.530648 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.189e874ffce68c94 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.845687956 +0000 UTC m=+5.067048854,LastTimestamp:2026-03-20 10:54:40.845687956 +0000 UTC m=+5.067048854,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.535707 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e874ffd019412 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.847459346 +0000 UTC m=+5.068820244,LastTimestamp:2026-03-20 10:54:40.847459346 +0000 UTC m=+5.068820244,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 
10:55:43.540269 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e874ffd58c5b8 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.853173688 +0000 UTC m=+5.074534586,LastTimestamp:2026-03-20 10:54:40.853173688 +0000 UTC m=+5.074534586,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.544579 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e874ffd726746 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.854853446 +0000 UTC 
m=+5.076214344,LastTimestamp:2026-03-20 10:54:40.854853446 +0000 UTC m=+5.076214344,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.549046 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e875003aa3071 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.959172721 +0000 UTC m=+5.180533619,LastTimestamp:2026-03-20 10:54:40.959172721 +0000 UTC m=+5.180533619,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.553497 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e875004a41d7c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.975551868 +0000 UTC m=+5.196912766,LastTimestamp:2026-03-20 10:54:40.975551868 +0000 UTC m=+5.196912766,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.558013 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e87500692d6c2 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:41.007974082 +0000 UTC m=+5.229334980,LastTimestamp:2026-03-20 10:54:41.007974082 +0000 UTC m=+5.229334980,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.562253 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e875006a56e84 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:41.00919258 +0000 UTC m=+5.230553478,LastTimestamp:2026-03-20 10:54:41.00919258 +0000 UTC m=+5.230553478,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.567318 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e875007d5e14c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:41.029144908 +0000 UTC m=+5.250505806,LastTimestamp:2026-03-20 10:54:41.029144908 +0000 UTC m=+5.250505806,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.571446 4860 event.go:359] "Server rejected 
event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e875007ed79fc openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:41.030691324 +0000 UTC m=+5.252052222,LastTimestamp:2026-03-20 10:54:41.030691324 +0000 UTC m=+5.252052222,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.575639 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8750080b3a90 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:41.032641168 +0000 UTC m=+5.254002066,LastTimestamp:2026-03-20 10:54:41.032641168 +0000 UTC 
m=+5.254002066,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.580742 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8750122e2f0f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:41.202704143 +0000 UTC m=+5.424065041,LastTimestamp:2026-03-20 10:54:41.202704143 +0000 UTC m=+5.424065041,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.585106 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e875012f0e138 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 
10:54:41.215463736 +0000 UTC m=+5.436824634,LastTimestamp:2026-03-20 10:54:41.215463736 +0000 UTC m=+5.436824634,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.589475 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8750130872d6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:41.217008342 +0000 UTC m=+5.438369240,LastTimestamp:2026-03-20 10:54:41.217008342 +0000 UTC m=+5.438369240,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.593526 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e87501e35ce1f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:41.404530207 +0000 UTC m=+5.625891105,LastTimestamp:2026-03-20 10:54:41.404530207 +0000 UTC m=+5.625891105,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.598543 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e87501ee0c999 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:41.415735705 +0000 UTC m=+5.637096603,LastTimestamp:2026-03-20 10:54:41.415735705 +0000 UTC m=+5.637096603,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.601538 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8750218c4bad 
openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:41.460530093 +0000 UTC m=+5.681890991,LastTimestamp:2026-03-20 10:54:41.460530093 +0000 UTC m=+5.681890991,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.603732 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e87502c96fe07 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:41.645780487 +0000 UTC m=+5.867141385,LastTimestamp:2026-03-20 10:54:41.645780487 +0000 UTC m=+5.867141385,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.606921 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in 
the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e87502df3733d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:41.668617021 +0000 UTC m=+5.889977919,LastTimestamp:2026-03-20 10:54:41.668617021 +0000 UTC m=+5.889977919,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.611618 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e87505e4bb584 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:42.479707524 +0000 UTC m=+6.701068432,LastTimestamp:2026-03-20 10:54:42.479707524 +0000 UTC m=+6.701068432,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.615891 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8750696709a9 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:42.666047913 +0000 UTC m=+6.887408811,LastTimestamp:2026-03-20 10:54:42.666047913 +0000 UTC m=+6.887408811,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.619782 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e875069ff3ab6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:42.676021942 +0000 UTC m=+6.897382850,LastTimestamp:2026-03-20 10:54:42.676021942 +0000 UTC m=+6.897382850,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.624101 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e87506a124406 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:42.67726951 +0000 UTC m=+6.898630408,LastTimestamp:2026-03-20 10:54:42.67726951 +0000 UTC m=+6.898630408,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.628180 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e87507696517e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:42.887250302 +0000 UTC m=+7.108611210,LastTimestamp:2026-03-20 10:54:42.887250302 +0000 UTC m=+7.108611210,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.632065 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API 
group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8750774fb72b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:42.899400491 +0000 UTC m=+7.120761429,LastTimestamp:2026-03-20 10:54:42.899400491 +0000 UTC m=+7.120761429,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.638251 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8750776304ba openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:42.90066553 +0000 UTC m=+7.122026438,LastTimestamp:2026-03-20 10:54:42.90066553 +0000 UTC m=+7.122026438,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.643057 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" 
cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e875082492402 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:43.083518978 +0000 UTC m=+7.304879866,LastTimestamp:2026-03-20 10:54:43.083518978 +0000 UTC m=+7.304879866,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.647029 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e875083347dac openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:43.098942892 +0000 UTC m=+7.320303790,LastTimestamp:2026-03-20 10:54:43.098942892 +0000 UTC m=+7.320303790,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.651245 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8750834705a6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:43.10015735 +0000 UTC m=+7.321518248,LastTimestamp:2026-03-20 10:54:43.10015735 +0000 UTC m=+7.321518248,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.655127 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e875092656d56 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:43.353808214 +0000 UTC m=+7.575169142,LastTimestamp:2026-03-20 10:54:43.353808214 +0000 UTC m=+7.575169142,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.658442 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create 
resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e87509329a424 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:43.3666673 +0000 UTC m=+7.588028208,LastTimestamp:2026-03-20 10:54:43.3666673 +0000 UTC m=+7.588028208,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.663174 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e87509345a8a6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:43.368503462 +0000 UTC m=+7.589864360,LastTimestamp:2026-03-20 10:54:43.368503462 +0000 UTC m=+7.589864360,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.667897 4860 event.go:359] "Server rejected event (will not retry!)" err="events is 
forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e87509d92a816 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:43.54132175 +0000 UTC m=+7.762682648,LastTimestamp:2026-03-20 10:54:43.54132175 +0000 UTC m=+7.762682648,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.672029 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e87509e3ea95d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:43.552594269 +0000 UTC m=+7.773955167,LastTimestamp:2026-03-20 10:54:43.552594269 +0000 UTC m=+7.773955167,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.678811 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the 
namespace \"openshift-kube-controller-manager\"" event=< Mar 20 10:55:43 crc kubenswrapper[4860]: &Event{ObjectMeta:{kube-controller-manager-crc.189e875218b0595c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 20 10:55:43 crc kubenswrapper[4860]: body: Mar 20 10:55:43 crc kubenswrapper[4860]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:49.90183254 +0000 UTC m=+14.123193448,LastTimestamp:2026-03-20 10:54:49.90183254 +0000 UTC m=+14.123193448,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 10:55:43 crc kubenswrapper[4860]: > Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.683288 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e875218b21afb openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting 
headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:49.901947643 +0000 UTC m=+14.123308561,LastTimestamp:2026-03-20 10:54:49.901947643 +0000 UTC m=+14.123308561,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.688832 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 20 10:55:43 crc kubenswrapper[4860]: &Event{ObjectMeta:{kube-apiserver-crc.189e875297babd86 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:ProbeError,Message:Liveness probe error: Get "https://192.168.126.11:17697/healthz": read tcp 192.168.126.11:52966->192.168.126.11:17697: read: connection reset by peer Mar 20 10:55:43 crc kubenswrapper[4860]: body: Mar 20 10:55:43 crc kubenswrapper[4860]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:52.033219974 +0000 UTC m=+16.254580912,LastTimestamp:2026-03-20 10:54:52.033219974 +0000 UTC m=+16.254580912,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 10:55:43 crc kubenswrapper[4860]: > Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.694132 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.189e875297bcc89f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Unhealthy,Message:Liveness probe failed: Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:52966->192.168.126.11:17697: read: connection reset by peer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:52.033353887 +0000 UTC m=+16.254714825,LastTimestamp:2026-03-20 10:54:52.033353887 +0000 UTC m=+16.254714825,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.699023 4860 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e8750130872d6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8750130872d6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:41.217008342 +0000 UTC m=+5.438369240,LastTimestamp:2026-03-20 10:54:52.52982466 +0000 UTC m=+16.751185558,Count:2,Type:Normal,EventTime:0001-01-01 
00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.702899 4860 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e87501e35ce1f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e87501e35ce1f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:41.404530207 +0000 UTC m=+5.625891105,LastTimestamp:2026-03-20 10:54:52.716404542 +0000 UTC m=+16.937765460,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.705888 4860 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e87501ee0c999\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e87501ee0c999 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container 
kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:41.415735705 +0000 UTC m=+5.637096603,LastTimestamp:2026-03-20 10:54:52.729470066 +0000 UTC m=+16.950830964,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.710008 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 20 10:55:43 crc kubenswrapper[4860]: &Event{ObjectMeta:{kube-apiserver-crc.189e8752deb48785 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 20 10:55:43 crc kubenswrapper[4860]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 20 10:55:43 crc kubenswrapper[4860]: Mar 20 10:55:43 crc kubenswrapper[4860]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:53.223995269 +0000 UTC m=+17.445356187,LastTimestamp:2026-03-20 10:54:53.223995269 +0000 UTC m=+17.445356187,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 10:55:43 crc kubenswrapper[4860]: > Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.717177 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot 
create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8752deb5a5c7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:53.224068551 +0000 UTC m=+17.445429459,LastTimestamp:2026-03-20 10:54:53.224068551 +0000 UTC m=+17.445429459,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.722773 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 10:55:43 crc kubenswrapper[4860]: &Event{ObjectMeta:{kube-controller-manager-crc.189e87546cc45667 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 20 10:55:43 crc kubenswrapper[4860]: body: Mar 20 10:55:43 crc kubenswrapper[4860]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:59.902363239 +0000 UTC 
m=+24.123724147,LastTimestamp:2026-03-20 10:54:59.902363239 +0000 UTC m=+24.123724147,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 10:55:43 crc kubenswrapper[4860]: > Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.726876 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e87546cc52df4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:59.90241842 +0000 UTC m=+24.123779318,LastTimestamp:2026-03-20 10:54:59.90241842 +0000 UTC m=+24.123779318,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.733185 4860 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e87546cc45667\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 10:55:43 crc kubenswrapper[4860]: &Event{ObjectMeta:{kube-controller-manager-crc.189e87546cc45667 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC 
map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 20 10:55:43 crc kubenswrapper[4860]: body: Mar 20 10:55:43 crc kubenswrapper[4860]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:59.902363239 +0000 UTC m=+24.123724147,LastTimestamp:2026-03-20 10:55:09.902721023 +0000 UTC m=+34.124081921,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 10:55:43 crc kubenswrapper[4860]: > Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.737703 4860 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e87546cc52df4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e87546cc52df4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:59.90241842 +0000 UTC m=+24.123779318,LastTimestamp:2026-03-20 
10:55:09.902814535 +0000 UTC m=+34.124175433,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.744055 4860 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8756c1047be9 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:09.905787881 +0000 UTC m=+34.127148779,LastTimestamp:2026-03-20 10:55:09.905787881 +0000 UTC m=+34.127148779,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.747826 4860 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e874fd73d7a7d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e874fd73d7a7d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.213850749 +0000 UTC m=+4.435211657,LastTimestamp:2026-03-20 10:55:10.022453539 +0000 UTC m=+34.243814447,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.752122 4860 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e874fec61e627\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e874fec61e627 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.568559143 +0000 UTC m=+4.789920041,LastTimestamp:2026-03-20 10:55:10.167997924 +0000 UTC m=+34.389358822,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.755597 4860 event.go:359] "Server rejected event 
(will not retry!)" err="events \"kube-controller-manager-crc.189e874fed485a3b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e874fed485a3b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:40.583662139 +0000 UTC m=+4.805023037,LastTimestamp:2026-03-20 10:55:10.176287726 +0000 UTC m=+34.397648624,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.761592 4860 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e87546cc45667\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 10:55:43 crc kubenswrapper[4860]: &Event{ObjectMeta:{kube-controller-manager-crc.189e87546cc45667 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 
20 10:55:43 crc kubenswrapper[4860]: body: Mar 20 10:55:43 crc kubenswrapper[4860]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:59.902363239 +0000 UTC m=+24.123724147,LastTimestamp:2026-03-20 10:55:19.902094399 +0000 UTC m=+44.123455317,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 10:55:43 crc kubenswrapper[4860]: > Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.766440 4860 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e87546cc52df4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e87546cc52df4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:59.90241842 +0000 UTC m=+24.123779318,LastTimestamp:2026-03-20 10:55:19.902722495 +0000 UTC m=+44.124083413,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:43 crc kubenswrapper[4860]: E0320 10:55:43.771474 4860 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e87546cc45667\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" 
in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 10:55:43 crc kubenswrapper[4860]: &Event{ObjectMeta:{kube-controller-manager-crc.189e87546cc45667 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 20 10:55:43 crc kubenswrapper[4860]: body: Mar 20 10:55:43 crc kubenswrapper[4860]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:59.902363239 +0000 UTC m=+24.123724147,LastTimestamp:2026-03-20 10:55:29.902317716 +0000 UTC m=+54.123678624,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 10:55:43 crc kubenswrapper[4860]: > Mar 20 10:55:44 crc kubenswrapper[4860]: I0320 10:55:44.266254 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:55:45 crc kubenswrapper[4860]: I0320 10:55:45.263050 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:55:46 crc kubenswrapper[4860]: I0320 10:55:46.263492 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get 
resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:55:46 crc kubenswrapper[4860]: I0320 10:55:46.901729 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:55:46 crc kubenswrapper[4860]: I0320 10:55:46.901916 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:46 crc kubenswrapper[4860]: I0320 10:55:46.902956 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:46 crc kubenswrapper[4860]: I0320 10:55:46.903001 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:46 crc kubenswrapper[4860]: I0320 10:55:46.903009 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:47 crc kubenswrapper[4860]: I0320 10:55:47.264938 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:55:47 crc kubenswrapper[4860]: I0320 10:55:47.368832 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:55:47 crc kubenswrapper[4860]: E0320 10:55:47.571587 4860 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 10:55:47 crc kubenswrapper[4860]: I0320 10:55:47.726265 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:47 crc kubenswrapper[4860]: I0320 10:55:47.727398 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 
20 10:55:47 crc kubenswrapper[4860]: I0320 10:55:47.727456 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:47 crc kubenswrapper[4860]: I0320 10:55:47.727474 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:48 crc kubenswrapper[4860]: I0320 10:55:48.264707 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:55:48 crc kubenswrapper[4860]: I0320 10:55:48.658551 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:48 crc kubenswrapper[4860]: I0320 10:55:48.661562 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:48 crc kubenswrapper[4860]: I0320 10:55:48.661622 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:48 crc kubenswrapper[4860]: I0320 10:55:48.661671 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:48 crc kubenswrapper[4860]: I0320 10:55:48.661719 4860 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 10:55:48 crc kubenswrapper[4860]: E0320 10:55:48.663823 4860 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 10:55:48 crc kubenswrapper[4860]: E0320 10:55:48.664152 4860 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create 
resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 10:55:48 crc kubenswrapper[4860]: I0320 10:55:48.732844 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:55:48 crc kubenswrapper[4860]: I0320 10:55:48.732996 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:48 crc kubenswrapper[4860]: I0320 10:55:48.734280 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:48 crc kubenswrapper[4860]: I0320 10:55:48.734306 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:48 crc kubenswrapper[4860]: I0320 10:55:48.734317 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:49 crc kubenswrapper[4860]: I0320 10:55:49.270265 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:55:49 crc kubenswrapper[4860]: I0320 10:55:49.413267 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:49 crc kubenswrapper[4860]: I0320 10:55:49.414758 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:49 crc kubenswrapper[4860]: I0320 10:55:49.414832 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:49 crc kubenswrapper[4860]: I0320 10:55:49.414850 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:49 crc kubenswrapper[4860]: I0320 
10:55:49.415720 4860 scope.go:117] "RemoveContainer" containerID="cf65a1d8049a22874f20125f441990ba9f9e338d0f8b206706a27079a88b069e" Mar 20 10:55:49 crc kubenswrapper[4860]: I0320 10:55:49.731972 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 10:55:49 crc kubenswrapper[4860]: I0320 10:55:49.733734 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa"} Mar 20 10:55:49 crc kubenswrapper[4860]: I0320 10:55:49.733936 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:49 crc kubenswrapper[4860]: I0320 10:55:49.735072 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:49 crc kubenswrapper[4860]: I0320 10:55:49.735129 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:49 crc kubenswrapper[4860]: I0320 10:55:49.735150 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:49 crc kubenswrapper[4860]: W0320 10:55:49.949023 4860 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 20 10:55:49 crc kubenswrapper[4860]: E0320 10:55:49.949086 4860 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource 
\"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 20 10:55:50 crc kubenswrapper[4860]: I0320 10:55:50.263725 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:55:50 crc kubenswrapper[4860]: I0320 10:55:50.439697 4860 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 10:55:50 crc kubenswrapper[4860]: I0320 10:55:50.466848 4860 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 20 10:55:50 crc kubenswrapper[4860]: I0320 10:55:50.738560 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 10:55:50 crc kubenswrapper[4860]: I0320 10:55:50.739113 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 10:55:50 crc kubenswrapper[4860]: I0320 10:55:50.740735 4860 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa" exitCode=255 Mar 20 10:55:50 crc kubenswrapper[4860]: I0320 10:55:50.740779 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa"} Mar 20 10:55:50 crc kubenswrapper[4860]: I0320 10:55:50.740819 4860 scope.go:117] "RemoveContainer" containerID="cf65a1d8049a22874f20125f441990ba9f9e338d0f8b206706a27079a88b069e" Mar 20 10:55:50 
crc kubenswrapper[4860]: I0320 10:55:50.740972 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:50 crc kubenswrapper[4860]: I0320 10:55:50.747643 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:50 crc kubenswrapper[4860]: I0320 10:55:50.747717 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:50 crc kubenswrapper[4860]: I0320 10:55:50.747736 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:50 crc kubenswrapper[4860]: I0320 10:55:50.748504 4860 scope.go:117] "RemoveContainer" containerID="b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa" Mar 20 10:55:50 crc kubenswrapper[4860]: E0320 10:55:50.748728 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 10:55:51 crc kubenswrapper[4860]: I0320 10:55:51.077277 4860 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:55:51 crc kubenswrapper[4860]: I0320 10:55:51.273364 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:55:51 crc kubenswrapper[4860]: I0320 10:55:51.746047 4860 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 10:55:51 crc kubenswrapper[4860]: I0320 10:55:51.749002 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:51 crc kubenswrapper[4860]: I0320 10:55:51.749866 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:51 crc kubenswrapper[4860]: I0320 10:55:51.749899 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:51 crc kubenswrapper[4860]: I0320 10:55:51.749908 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:51 crc kubenswrapper[4860]: I0320 10:55:51.750554 4860 scope.go:117] "RemoveContainer" containerID="b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa" Mar 20 10:55:51 crc kubenswrapper[4860]: E0320 10:55:51.750852 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 10:55:52 crc kubenswrapper[4860]: I0320 10:55:52.260894 4860 csr.go:261] certificate signing request csr-66mx8 is approved, waiting to be issued Mar 20 10:55:52 crc kubenswrapper[4860]: I0320 10:55:52.264097 4860 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:55:52 crc kubenswrapper[4860]: I0320 10:55:52.268530 4860 csr.go:257] 
certificate signing request csr-66mx8 is issued Mar 20 10:55:52 crc kubenswrapper[4860]: I0320 10:55:52.307531 4860 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 20 10:55:52 crc kubenswrapper[4860]: I0320 10:55:52.639437 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:55:52 crc kubenswrapper[4860]: I0320 10:55:52.728733 4860 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 20 10:55:52 crc kubenswrapper[4860]: I0320 10:55:52.751836 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:52 crc kubenswrapper[4860]: I0320 10:55:52.753154 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:52 crc kubenswrapper[4860]: I0320 10:55:52.753219 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:52 crc kubenswrapper[4860]: I0320 10:55:52.753269 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:52 crc kubenswrapper[4860]: I0320 10:55:52.754213 4860 scope.go:117] "RemoveContainer" containerID="b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa" Mar 20 10:55:52 crc kubenswrapper[4860]: E0320 10:55:52.754446 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 10:55:52 crc kubenswrapper[4860]: I0320 10:55:52.963080 4860 transport.go:147] "Certificate rotation detected, shutting down client 
connections to start using new credentials" Mar 20 10:55:52 crc kubenswrapper[4860]: W0320 10:55:52.963342 4860 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.Service ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received Mar 20 10:55:53 crc kubenswrapper[4860]: I0320 10:55:53.270972 4860 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-11 20:20:01.884747405 +0000 UTC Mar 20 10:55:53 crc kubenswrapper[4860]: I0320 10:55:53.271026 4860 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6393h24m8.613725185s for next certificate rotation Mar 20 10:55:55 crc kubenswrapper[4860]: I0320 10:55:55.664692 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:55 crc kubenswrapper[4860]: I0320 10:55:55.666508 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:55 crc kubenswrapper[4860]: I0320 10:55:55.666584 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:55 crc kubenswrapper[4860]: I0320 10:55:55.666603 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:55 crc kubenswrapper[4860]: I0320 10:55:55.666709 4860 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 10:55:55 crc kubenswrapper[4860]: I0320 10:55:55.679597 4860 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 20 10:55:55 crc kubenswrapper[4860]: I0320 10:55:55.680132 4860 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 20 10:55:55 crc kubenswrapper[4860]: E0320 10:55:55.680169 4860 kubelet_node_status.go:585] 
"Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 20 10:55:55 crc kubenswrapper[4860]: I0320 10:55:55.685210 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:55 crc kubenswrapper[4860]: I0320 10:55:55.685300 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:55 crc kubenswrapper[4860]: I0320 10:55:55.685321 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:55 crc kubenswrapper[4860]: I0320 10:55:55.685350 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:55 crc kubenswrapper[4860]: I0320 10:55:55.685374 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:55Z","lastTransitionTime":"2026-03-20T10:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:55 crc kubenswrapper[4860]: E0320 10:55:55.706546 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:55:55 crc kubenswrapper[4860]: I0320 10:55:55.716657 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:55 crc kubenswrapper[4860]: I0320 10:55:55.716703 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:55 crc kubenswrapper[4860]: I0320 10:55:55.716717 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:55 crc kubenswrapper[4860]: I0320 10:55:55.716737 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:55 crc kubenswrapper[4860]: I0320 10:55:55.716752 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:55Z","lastTransitionTime":"2026-03-20T10:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:55 crc kubenswrapper[4860]: E0320 10:55:55.731838 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:55:55 crc kubenswrapper[4860]: I0320 10:55:55.743406 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:55 crc kubenswrapper[4860]: I0320 10:55:55.743459 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:55 crc kubenswrapper[4860]: I0320 10:55:55.743474 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:55 crc kubenswrapper[4860]: I0320 10:55:55.743499 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:55 crc kubenswrapper[4860]: I0320 10:55:55.743516 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:55Z","lastTransitionTime":"2026-03-20T10:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:55 crc kubenswrapper[4860]: E0320 10:55:55.762409 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:55:55 crc kubenswrapper[4860]: I0320 10:55:55.772948 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:55 crc kubenswrapper[4860]: I0320 10:55:55.773003 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:55 crc kubenswrapper[4860]: I0320 10:55:55.773019 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:55 crc kubenswrapper[4860]: I0320 10:55:55.773040 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:55 crc kubenswrapper[4860]: I0320 10:55:55.773058 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:55Z","lastTransitionTime":"2026-03-20T10:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:55 crc kubenswrapper[4860]: E0320 10:55:55.792484 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:55:55 crc kubenswrapper[4860]: E0320 10:55:55.792749 4860 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 10:55:55 crc kubenswrapper[4860]: E0320 10:55:55.792800 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:55:55 crc kubenswrapper[4860]: E0320 10:55:55.893880 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:55:55 crc kubenswrapper[4860]: E0320 10:55:55.995007 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:55:56 crc kubenswrapper[4860]: E0320 10:55:56.095410 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:55:56 crc kubenswrapper[4860]: E0320 10:55:56.196397 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:55:56 crc kubenswrapper[4860]: E0320 10:55:56.296906 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:55:56 crc kubenswrapper[4860]: E0320 10:55:56.398055 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:55:56 crc kubenswrapper[4860]: E0320 10:55:56.498587 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:55:56 crc kubenswrapper[4860]: E0320 10:55:56.598891 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:55:56 crc kubenswrapper[4860]: E0320 10:55:56.699932 4860 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:55:56 crc kubenswrapper[4860]: E0320 10:55:56.800706 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:55:56 crc kubenswrapper[4860]: E0320 10:55:56.901044 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:55:57 crc kubenswrapper[4860]: E0320 10:55:57.001575 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:55:57 crc kubenswrapper[4860]: E0320 10:55:57.102326 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:55:57 crc kubenswrapper[4860]: E0320 10:55:57.203323 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:55:57 crc kubenswrapper[4860]: E0320 10:55:57.303968 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:55:57 crc kubenswrapper[4860]: I0320 10:55:57.374913 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:55:57 crc kubenswrapper[4860]: I0320 10:55:57.375167 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:57 crc kubenswrapper[4860]: I0320 10:55:57.377338 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:57 crc kubenswrapper[4860]: I0320 10:55:57.377415 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:57 crc kubenswrapper[4860]: I0320 10:55:57.377441 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 20 10:55:57 crc kubenswrapper[4860]: E0320 10:55:57.404820 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:55:57 crc kubenswrapper[4860]: E0320 10:55:57.505610 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:55:57 crc kubenswrapper[4860]: E0320 10:55:57.572288 4860 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 10:55:57 crc kubenswrapper[4860]: E0320 10:55:57.606613 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:55:57 crc kubenswrapper[4860]: E0320 10:55:57.707734 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:55:57 crc kubenswrapper[4860]: E0320 10:55:57.808846 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:55:57 crc kubenswrapper[4860]: E0320 10:55:57.909949 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:55:58 crc kubenswrapper[4860]: E0320 10:55:58.010841 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:55:58 crc kubenswrapper[4860]: E0320 10:55:58.111062 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:55:58 crc kubenswrapper[4860]: E0320 10:55:58.211271 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:55:58 crc kubenswrapper[4860]: E0320 10:55:58.311888 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:55:58 crc kubenswrapper[4860]: E0320 10:55:58.413147 4860 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:55:58 crc kubenswrapper[4860]: E0320 10:55:58.514108 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:55:58 crc kubenswrapper[4860]: E0320 10:55:58.614679 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:55:58 crc kubenswrapper[4860]: E0320 10:55:58.715473 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:55:58 crc kubenswrapper[4860]: E0320 10:55:58.815793 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:55:58 crc kubenswrapper[4860]: E0320 10:55:58.916886 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:55:59 crc kubenswrapper[4860]: E0320 10:55:59.018029 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:55:59 crc kubenswrapper[4860]: E0320 10:55:59.119087 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:55:59 crc kubenswrapper[4860]: E0320 10:55:59.220084 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:55:59 crc kubenswrapper[4860]: E0320 10:55:59.320907 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:55:59 crc kubenswrapper[4860]: E0320 10:55:59.421089 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:55:59 crc kubenswrapper[4860]: E0320 10:55:59.521832 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:55:59 crc 
kubenswrapper[4860]: E0320 10:55:59.622929 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:55:59 crc kubenswrapper[4860]: E0320 10:55:59.724110 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:55:59 crc kubenswrapper[4860]: E0320 10:55:59.825191 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:55:59 crc kubenswrapper[4860]: E0320 10:55:59.926410 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:00 crc kubenswrapper[4860]: E0320 10:56:00.027023 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:00 crc kubenswrapper[4860]: E0320 10:56:00.127304 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:00 crc kubenswrapper[4860]: E0320 10:56:00.228302 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:00 crc kubenswrapper[4860]: E0320 10:56:00.329285 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:00 crc kubenswrapper[4860]: E0320 10:56:00.429975 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:00 crc kubenswrapper[4860]: E0320 10:56:00.530662 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:00 crc kubenswrapper[4860]: E0320 10:56:00.631720 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:00 crc kubenswrapper[4860]: E0320 10:56:00.732433 4860 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 20 10:56:00 crc kubenswrapper[4860]: E0320 10:56:00.833052 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:00 crc kubenswrapper[4860]: E0320 10:56:00.934301 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:01 crc kubenswrapper[4860]: E0320 10:56:01.035217 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:01 crc kubenswrapper[4860]: E0320 10:56:01.136317 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:01 crc kubenswrapper[4860]: E0320 10:56:01.237389 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:01 crc kubenswrapper[4860]: E0320 10:56:01.338064 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:01 crc kubenswrapper[4860]: E0320 10:56:01.438514 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:01 crc kubenswrapper[4860]: E0320 10:56:01.539459 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:01 crc kubenswrapper[4860]: E0320 10:56:01.639968 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:01 crc kubenswrapper[4860]: E0320 10:56:01.741041 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:01 crc kubenswrapper[4860]: E0320 10:56:01.841540 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:01 crc kubenswrapper[4860]: E0320 10:56:01.941642 4860 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:02 crc kubenswrapper[4860]: E0320 10:56:02.042146 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:02 crc kubenswrapper[4860]: E0320 10:56:02.142426 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:02 crc kubenswrapper[4860]: E0320 10:56:02.243719 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:02 crc kubenswrapper[4860]: E0320 10:56:02.344697 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:02 crc kubenswrapper[4860]: I0320 10:56:02.413066 4860 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:56:02 crc kubenswrapper[4860]: I0320 10:56:02.415561 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:02 crc kubenswrapper[4860]: I0320 10:56:02.415610 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:02 crc kubenswrapper[4860]: I0320 10:56:02.415620 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:02 crc kubenswrapper[4860]: E0320 10:56:02.445322 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:02 crc kubenswrapper[4860]: E0320 10:56:02.545952 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:02 crc kubenswrapper[4860]: E0320 10:56:02.646847 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:02 crc kubenswrapper[4860]: I0320 
10:56:02.689423 4860 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 20 10:56:02 crc kubenswrapper[4860]: E0320 10:56:02.747906 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:02 crc kubenswrapper[4860]: E0320 10:56:02.848355 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:02 crc kubenswrapper[4860]: E0320 10:56:02.949106 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:03 crc kubenswrapper[4860]: E0320 10:56:03.050339 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:03 crc kubenswrapper[4860]: E0320 10:56:03.150602 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:03 crc kubenswrapper[4860]: E0320 10:56:03.251735 4860 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:03 crc kubenswrapper[4860]: I0320 10:56:03.343013 4860 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 20 10:56:03 crc kubenswrapper[4860]: I0320 10:56:03.354883 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:03 crc kubenswrapper[4860]: I0320 10:56:03.354961 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:03 crc kubenswrapper[4860]: I0320 10:56:03.354988 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:03 crc kubenswrapper[4860]: I0320 10:56:03.355020 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:03 
crc kubenswrapper[4860]: I0320 10:56:03.355044 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:03Z","lastTransitionTime":"2026-03-20T10:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:03 crc kubenswrapper[4860]: I0320 10:56:03.459267 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:03 crc kubenswrapper[4860]: I0320 10:56:03.459340 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:03 crc kubenswrapper[4860]: I0320 10:56:03.459358 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:03 crc kubenswrapper[4860]: I0320 10:56:03.459880 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:03 crc kubenswrapper[4860]: I0320 10:56:03.459941 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:03Z","lastTransitionTime":"2026-03-20T10:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:03 crc kubenswrapper[4860]: I0320 10:56:03.563124 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:03 crc kubenswrapper[4860]: I0320 10:56:03.563183 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:03 crc kubenswrapper[4860]: I0320 10:56:03.563196 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:03 crc kubenswrapper[4860]: I0320 10:56:03.563217 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:03 crc kubenswrapper[4860]: I0320 10:56:03.563249 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:03Z","lastTransitionTime":"2026-03-20T10:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:03 crc kubenswrapper[4860]: I0320 10:56:03.666319 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:03 crc kubenswrapper[4860]: I0320 10:56:03.666379 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:03 crc kubenswrapper[4860]: I0320 10:56:03.666402 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:03 crc kubenswrapper[4860]: I0320 10:56:03.666432 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:03 crc kubenswrapper[4860]: I0320 10:56:03.666454 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:03Z","lastTransitionTime":"2026-03-20T10:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:03 crc kubenswrapper[4860]: I0320 10:56:03.769886 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:03 crc kubenswrapper[4860]: I0320 10:56:03.769949 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:03 crc kubenswrapper[4860]: I0320 10:56:03.769967 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:03 crc kubenswrapper[4860]: I0320 10:56:03.769992 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:03 crc kubenswrapper[4860]: I0320 10:56:03.770011 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:03Z","lastTransitionTime":"2026-03-20T10:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:03 crc kubenswrapper[4860]: I0320 10:56:03.873202 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:03 crc kubenswrapper[4860]: I0320 10:56:03.873296 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:03 crc kubenswrapper[4860]: I0320 10:56:03.873319 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:03 crc kubenswrapper[4860]: I0320 10:56:03.873348 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:03 crc kubenswrapper[4860]: I0320 10:56:03.873372 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:03Z","lastTransitionTime":"2026-03-20T10:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:03 crc kubenswrapper[4860]: I0320 10:56:03.977543 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:03 crc kubenswrapper[4860]: I0320 10:56:03.977604 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:03 crc kubenswrapper[4860]: I0320 10:56:03.977614 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:03 crc kubenswrapper[4860]: I0320 10:56:03.977640 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:03 crc kubenswrapper[4860]: I0320 10:56:03.977652 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:03Z","lastTransitionTime":"2026-03-20T10:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.081661 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.081746 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.081771 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.081795 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.081813 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:04Z","lastTransitionTime":"2026-03-20T10:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.185397 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.185473 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.185489 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.185516 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.185548 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:04Z","lastTransitionTime":"2026-03-20T10:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.288908 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.288965 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.288976 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.288998 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.289011 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:04Z","lastTransitionTime":"2026-03-20T10:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.299084 4860 apiserver.go:52] "Watching apiserver" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.305261 4860 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.305788 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb"] Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.306344 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.306396 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 10:56:04 crc kubenswrapper[4860]: E0320 10:56:04.306471 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.306690 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:04 crc kubenswrapper[4860]: E0320 10:56:04.306778 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.307083 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.307561 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.307579 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:04 crc kubenswrapper[4860]: E0320 10:56:04.307688 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.312128 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.312891 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.320115 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.321661 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.321992 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.322221 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.322477 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.322570 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.322751 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.346525 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.362254 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.367160 4860 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.376054 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.388365 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.389693 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.389774 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.389802 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod 
\"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.389834 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.389862 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.389894 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.389924 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.389951 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.389979 4860 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.390006 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.390032 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.390115 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.390140 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.390164 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod 
\"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.390189 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.390277 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.390303 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.390332 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.390362 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.390389 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.391131 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.390243 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.391557 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.391716 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). 
InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.391752 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.391725 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.391954 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.392596 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.392615 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.392606 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.392841 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.392987 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.393122 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.393166 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.393303 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.393276 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.391504 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.393284 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.393754 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.394120 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.394105 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.394724 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.394505 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.394921 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.395341 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.395353 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.395391 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.395471 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.395527 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.395683 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.395598 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.395894 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.395896 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.396203 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.395938 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.396378 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.397049 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.397083 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.396826 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.397205 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.396550 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.397294 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.397302 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.397737 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.398083 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.398174 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.398199 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: 
\"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.398261 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.398303 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.398344 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.398382 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.398412 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.398443 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.398477 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.398510 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.398549 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.398580 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.398608 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 
10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.400663 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.400891 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.401056 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.401092 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.401116 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 
10:56:04.401137 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.401157 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.401176 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.401194 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.401213 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.401252 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod 
\"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.401270 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.401291 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.401312 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.401335 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.401353 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.401370 4860 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.401389 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.401408 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.401427 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.401446 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.401470 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.401491 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.401508 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.401638 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.401659 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.401679 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") 
" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.401704 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.401731 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.401758 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.401781 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.401800 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.401819 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.401837 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.401855 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.401873 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.401891 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.401913 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 20 10:56:04 crc 
kubenswrapper[4860]: I0320 10:56:04.401932 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.401952 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.401972 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.401971 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.401993 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.402014 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.402033 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.402060 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.402080 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.402099 4860 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.402118 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.402142 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.402161 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.402180 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.402201 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" 
(UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.402245 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.402268 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.402291 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.402317 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.402341 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.402359 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.402375 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.402408 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.402436 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.402457 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.402478 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: 
\"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.402503 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.402529 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.402550 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.402575 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.402590 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.402637 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.402886 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.402601 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403038 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403074 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 10:56:04 crc 
kubenswrapper[4860]: I0320 10:56:04.403100 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403126 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403149 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403169 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403253 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403282 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403311 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403342 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403363 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403382 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403400 4860 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403418 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403436 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403453 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403473 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403488 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 
20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403506 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403522 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403407 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403542 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403563 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403580 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403595 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403613 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403631 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403650 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403672 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403689 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403712 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403732 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: 
\"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403750 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403769 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403786 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403804 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403823 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403818 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403842 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403828 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403871 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403906 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403888 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.403927 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.404005 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.404070 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.404055 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.404115 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.404157 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.404181 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.404210 4860 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.404241 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.404267 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.404302 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.404332 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.404360 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.404386 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.404412 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.404446 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.404475 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.404501 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 20 
10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.404532 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.404563 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.404595 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.404621 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.404650 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.404664 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:04 crc 
kubenswrapper[4860]: I0320 10:56:04.404705 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.404721 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.404743 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.404759 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:04Z","lastTransitionTime":"2026-03-20T10:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.404677 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.406419 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.406452 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: 
\"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.406481 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.406509 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.406537 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.406563 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.406589 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.406613 4860 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.406635 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.406655 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.406676 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.406712 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.406735 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " 
Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.406755 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.406775 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.406813 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.406841 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.406867 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.406892 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: 
\"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.406914 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.406941 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.406971 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.406988 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407007 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 
10:56:04.407046 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407073 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407094 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407128 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407151 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod 
\"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407173 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407193 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407240 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407271 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407296 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: 
\"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407315 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407334 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407465 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407490 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407593 4860 reconciler_common.go:293] "Volume detached for 
volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407612 4860 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407626 4860 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407640 4860 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407655 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407669 4860 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407683 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407696 4860 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" 
DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407713 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407729 4860 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407746 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407762 4860 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407778 4860 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407793 4860 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407807 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 20 
10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407820 4860 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407834 4860 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407849 4860 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407861 4860 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407873 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407886 4860 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407897 4860 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407910 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: 
\"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407924 4860 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407938 4860 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407951 4860 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407965 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407977 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407991 4860 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.408006 4860 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.408018 4860 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.408034 4860 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.408048 4860 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.408060 4860 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.408074 4860 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.408088 4860 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.408102 4860 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 
crc kubenswrapper[4860]: I0320 10:56:04.408125 4860 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.408140 4860 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.408152 4860 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.408167 4860 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.408182 4860 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.408196 4860 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.408210 4860 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.408245 4860 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.408260 4860 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.404261 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.404337 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.404673 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.404725 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.404742 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.404861 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.405511 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.405943 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.409398 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.406155 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.406643 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.406669 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.406694 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.406789 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407004 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407199 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407214 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407584 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407615 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407690 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.407744 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.408078 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.408094 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.408347 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.408410 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.408511 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.408725 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.408786 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.408825 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.409036 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.409130 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: E0320 10:56:04.409184 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:56:04.909149032 +0000 UTC m=+89.130509930 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.409728 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.410129 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.410196 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.410209 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.410472 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.410566 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.410663 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.410671 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.410737 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.410933 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.410945 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.410989 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.411099 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.411103 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.411154 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.411166 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.411342 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.411665 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.411679 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.411937 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.411945 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.412676 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.413188 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.413264 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.413293 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.413327 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.413341 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.413386 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.413398 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.413605 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.413633 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.413789 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.413790 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.414153 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.414169 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.414298 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.414328 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.414488 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.414630 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: E0320 10:56:04.414642 4860 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 10:56:04 crc kubenswrapper[4860]: E0320 10:56:04.414706 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:04.914691181 +0000 UTC m=+89.136052279 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.414893 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.415014 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.415025 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.415104 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.415392 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.415462 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: E0320 10:56:04.415661 4860 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 10:56:04 crc kubenswrapper[4860]: E0320 10:56:04.415736 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:04.915714979 +0000 UTC m=+89.137075867 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.415899 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.416030 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.416269 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.416309 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.416705 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.417165 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.417280 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.417568 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.417676 4860 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.417869 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.417961 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.418129 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.418429 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.418431 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.418595 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.418623 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.418739 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.419199 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.419376 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.419635 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.419803 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.420076 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.420703 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.420993 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.421359 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.421760 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.421943 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.422711 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.422743 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.422782 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.422836 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.423016 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.423317 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.424447 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: 
"machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.424885 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.424733 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.427434 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.429744 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.429876 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.433536 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: E0320 10:56:04.434926 4860 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 10:56:04 crc kubenswrapper[4860]: E0320 10:56:04.434956 4860 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 10:56:04 crc kubenswrapper[4860]: E0320 10:56:04.434972 4860 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:56:04 crc kubenswrapper[4860]: E0320 10:56:04.435067 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:04.935038757 +0000 UTC m=+89.156399655 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.435162 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.435970 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.436138 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.436784 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.436966 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.437206 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.437697 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.437905 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.438368 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.442180 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 10:56:04 crc kubenswrapper[4860]: E0320 10:56:04.443901 4860 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 10:56:04 crc 
kubenswrapper[4860]: E0320 10:56:04.443931 4860 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 10:56:04 crc kubenswrapper[4860]: E0320 10:56:04.443945 4860 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:56:04 crc kubenswrapper[4860]: E0320 10:56:04.444021 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:04.943999597 +0000 UTC m=+89.165360495 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.446303 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.446341 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.447716 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.448151 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.448385 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.448473 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.448815 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.448915 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.449379 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.449601 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.449682 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.449710 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.450006 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.450123 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.450370 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.450412 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.450676 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.450760 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.450982 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.451489 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.451617 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.452096 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.452135 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.452127 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.452414 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.452583 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.452917 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.453076 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.453255 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.453650 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.454075 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.460925 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.463880 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.473551 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.474382 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.507841 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.507879 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.507887 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.507902 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.507911 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:04Z","lastTransitionTime":"2026-03-20T10:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.508642 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.508711 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.508779 4860 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.508809 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.508825 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.508838 4860 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.508850 4860 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.508864 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.508852 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.508892 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.508876 4860 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509004 4860 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node 
\"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509021 4860 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509037 4860 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509051 4860 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509064 4860 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509079 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509094 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509107 4860 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 
10:56:04.509119 4860 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509132 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509151 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509173 4860 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509186 4860 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509199 4860 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509212 4860 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509250 4860 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509265 4860 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509278 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509291 4860 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509305 4860 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509319 4860 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509333 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509348 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: 
\"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509361 4860 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509374 4860 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509387 4860 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509403 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509417 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509432 4860 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509444 4860 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509458 4860 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509472 4860 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509485 4860 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509497 4860 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509512 4860 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509528 4860 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509546 4860 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" 
DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509561 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509574 4860 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509610 4860 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509624 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509636 4860 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509648 4860 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509661 4860 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc 
kubenswrapper[4860]: I0320 10:56:04.509675 4860 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509688 4860 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509700 4860 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509712 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509724 4860 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509735 4860 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509746 4860 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509759 4860 reconciler_common.go:293] "Volume detached for volume 
\"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509772 4860 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509783 4860 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509797 4860 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509809 4860 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509821 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509833 4860 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509846 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" 
DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509858 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509871 4860 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509883 4860 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509896 4860 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509908 4860 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509920 4860 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509933 4860 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509945 4860 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509961 4860 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509974 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509987 4860 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.509998 4860 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510009 4860 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510021 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510033 4860 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510045 4860 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510057 4860 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510077 4860 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510088 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510100 4860 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510111 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510123 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510134 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510147 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510159 4860 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510171 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510182 4860 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510193 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510204 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: 
\"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510215 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510251 4860 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510263 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510274 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510287 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510299 4860 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510310 4860 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510321 4860 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510334 4860 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510345 4860 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510357 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510368 4860 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510379 4860 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510390 4860 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 
20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510401 4860 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510414 4860 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510427 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510438 4860 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510448 4860 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510459 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510476 4860 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510486 4860 
reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510497 4860 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510509 4860 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510520 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510530 4860 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510543 4860 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510564 4860 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510574 4860 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: 
\"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510593 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510604 4860 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510617 4860 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510628 4860 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510639 4860 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510651 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510664 4860 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 
10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510677 4860 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510691 4860 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510702 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510712 4860 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510724 4860 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510736 4860 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510748 4860 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510760 4860 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510772 4860 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510785 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510798 4860 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510811 4860 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510823 4860 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510838 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510850 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" 
DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510865 4860 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.510880 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.610805 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.610856 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.610867 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.610889 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.610899 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:04Z","lastTransitionTime":"2026-03-20T10:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.626188 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.643132 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.657311 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 10:56:04 crc kubenswrapper[4860]: W0320 10:56:04.687844 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-571c75a4815de926687021cf89c33db429d5e84710fed66ad8d963e67f7cf8e1 WatchSource:0}: Error finding container 571c75a4815de926687021cf89c33db429d5e84710fed66ad8d963e67f7cf8e1: Status 404 returned error can't find the container with id 571c75a4815de926687021cf89c33db429d5e84710fed66ad8d963e67f7cf8e1 Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.716313 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.716367 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.716378 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.716397 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.716412 4860 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:04Z","lastTransitionTime":"2026-03-20T10:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.787755 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"571c75a4815de926687021cf89c33db429d5e84710fed66ad8d963e67f7cf8e1"} Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.789342 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"63997ce9f6f9fd2913962a005d896518c8716e997da0a38f4c75591bd1349459"} Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.790389 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"6667f283d81a023cc34d25c0b8cc71760accc6925c987ad4f43175d0ea1ed791"} Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.819115 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.819151 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.819160 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.819177 4860 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.819187 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:04Z","lastTransitionTime":"2026-03-20T10:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.913648 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:56:04 crc kubenswrapper[4860]: E0320 10:56:04.913809 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:56:05.91376365 +0000 UTC m=+90.135124548 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.922321 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.922401 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.922425 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.922453 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:04 crc kubenswrapper[4860]: I0320 10:56:04.922472 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:04Z","lastTransitionTime":"2026-03-20T10:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.014751 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.014821 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.014858 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.014897 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:05 crc kubenswrapper[4860]: E0320 10:56:05.014933 4860 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object 
"openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 10:56:05 crc kubenswrapper[4860]: E0320 10:56:05.015012 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:06.014994476 +0000 UTC m=+90.236355374 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 10:56:05 crc kubenswrapper[4860]: E0320 10:56:05.015023 4860 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 10:56:05 crc kubenswrapper[4860]: E0320 10:56:05.015046 4860 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 10:56:05 crc kubenswrapper[4860]: E0320 10:56:05.015058 4860 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:56:05 crc kubenswrapper[4860]: E0320 10:56:05.015062 4860 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 10:56:05 crc kubenswrapper[4860]: E0320 10:56:05.015087 4860 
projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 10:56:05 crc kubenswrapper[4860]: E0320 10:56:05.015091 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:06.015082268 +0000 UTC m=+90.236443166 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:56:05 crc kubenswrapper[4860]: E0320 10:56:05.015102 4860 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:56:05 crc kubenswrapper[4860]: E0320 10:56:05.015134 4860 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 10:56:05 crc kubenswrapper[4860]: E0320 10:56:05.015162 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:06.01514456 +0000 UTC m=+90.236505478 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:56:05 crc kubenswrapper[4860]: E0320 10:56:05.015186 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:06.015174221 +0000 UTC m=+90.236535129 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.024606 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.024651 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.024668 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.024686 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.024698 4860 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:05Z","lastTransitionTime":"2026-03-20T10:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.127865 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.127915 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.127926 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.127944 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.127955 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:05Z","lastTransitionTime":"2026-03-20T10:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.231396 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.231467 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.231494 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.231530 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.231562 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:05Z","lastTransitionTime":"2026-03-20T10:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.335094 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.335159 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.335178 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.335205 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.335257 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:05Z","lastTransitionTime":"2026-03-20T10:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.413641 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:05 crc kubenswrapper[4860]: E0320 10:56:05.413833 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.421871 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.422411 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.423175 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.423812 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.424456 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.424963 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.425618 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.426135 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.426783 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.427290 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.427763 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.428476 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.429039 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.429717 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.430987 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.432809 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.434827 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.436682 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.438068 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.439707 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.439788 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.439804 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.439806 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.439855 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.440059 4860 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:05Z","lastTransitionTime":"2026-03-20T10:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.440964 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.441795 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.442400 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.443299 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.443891 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.444534 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.445299 4860 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.445959 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.446827 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.448779 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.449709 4860 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.449892 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.452032 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.452812 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.453768 
4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.455484 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.456530 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.457365 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.458469 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.460472 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.461214 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.462990 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.463750 
4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.464855 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.465448 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.466587 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.467447 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.468950 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.469760 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.471002 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.471788 
4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.472536 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.473851 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.474598 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.543750 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.543803 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.543816 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.543834 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.543848 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:05Z","lastTransitionTime":"2026-03-20T10:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.647796 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.647858 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.647876 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.647900 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.647911 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:05Z","lastTransitionTime":"2026-03-20T10:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.750731 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.750780 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.750795 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.750814 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.750829 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:05Z","lastTransitionTime":"2026-03-20T10:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.795510 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12"} Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.797928 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4"} Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.797983 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7"} Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.812043 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:05Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.833144 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:05Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.850661 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:05Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.854245 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.854291 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.854304 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.854326 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.854343 4860 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:05Z","lastTransitionTime":"2026-03-20T10:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.866160 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:05Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.880078 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready 
status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:05Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.897553 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:05Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.914584 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":tru
e,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:05Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.923998 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:56:05 crc kubenswrapper[4860]: E0320 10:56:05.924146 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:56:07.924115396 +0000 UTC m=+92.145476314 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.930982 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:05Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.954030 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:05Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.957702 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.957777 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.957791 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.957811 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.957825 4860 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:05Z","lastTransitionTime":"2026-03-20T10:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.971043 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:05Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:05 crc kubenswrapper[4860]: I0320 10:56:05.986075 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:05Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.006555 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:06Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.024835 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.024895 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.024923 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.024948 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:06 crc kubenswrapper[4860]: E0320 10:56:06.025031 4860 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 10:56:06 crc kubenswrapper[4860]: E0320 10:56:06.025071 4860 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 10:56:06 crc kubenswrapper[4860]: E0320 10:56:06.025073 4860 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 10:56:06 crc kubenswrapper[4860]: E0320 10:56:06.025031 4860 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 10:56:06 crc kubenswrapper[4860]: E0320 10:56:06.025113 4860 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 10:56:06 crc kubenswrapper[4860]: E0320 10:56:06.025129 4860 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:08.025103245 +0000 UTC m=+92.246464153 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 10:56:06 crc kubenswrapper[4860]: E0320 10:56:06.025088 4860 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 10:56:06 crc kubenswrapper[4860]: E0320 10:56:06.025148 4860 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:56:06 crc kubenswrapper[4860]: E0320 10:56:06.025135 4860 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:56:06 crc kubenswrapper[4860]: E0320 10:56:06.025153 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-20 10:56:08.025142226 +0000 UTC m=+92.246503124 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 10:56:06 crc kubenswrapper[4860]: E0320 10:56:06.025255 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:08.025219539 +0000 UTC m=+92.246580447 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:56:06 crc kubenswrapper[4860]: E0320 10:56:06.025292 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:08.02527632 +0000 UTC m=+92.246637228 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.050132 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.050175 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.050188 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.050206 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.050235 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:06Z","lastTransitionTime":"2026-03-20T10:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:06 crc kubenswrapper[4860]: E0320 10:56:06.064958 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:06Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.068358 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.068385 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.068394 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.068408 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.068419 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:06Z","lastTransitionTime":"2026-03-20T10:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:06 crc kubenswrapper[4860]: E0320 10:56:06.084051 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:06Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.087691 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.087728 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.087739 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.087755 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.087766 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:06Z","lastTransitionTime":"2026-03-20T10:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:06 crc kubenswrapper[4860]: E0320 10:56:06.102180 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:06Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.106511 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.106553 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.106564 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.106584 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.106597 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:06Z","lastTransitionTime":"2026-03-20T10:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:06 crc kubenswrapper[4860]: E0320 10:56:06.133856 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:06Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.141696 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.141743 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.141757 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.141778 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.141792 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:06Z","lastTransitionTime":"2026-03-20T10:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:06 crc kubenswrapper[4860]: E0320 10:56:06.194965 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:06Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:06 crc kubenswrapper[4860]: E0320 10:56:06.195090 4860 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.197274 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.197318 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.197332 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.197352 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.197366 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:06Z","lastTransitionTime":"2026-03-20T10:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.300395 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.300450 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.300461 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.300480 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.300492 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:06Z","lastTransitionTime":"2026-03-20T10:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.402893 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.403029 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.403041 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.403069 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.403085 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:06Z","lastTransitionTime":"2026-03-20T10:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.413329 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.413456 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:06 crc kubenswrapper[4860]: E0320 10:56:06.413575 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:06 crc kubenswrapper[4860]: E0320 10:56:06.413642 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.505382 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.505462 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.505473 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.505493 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.505505 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:06Z","lastTransitionTime":"2026-03-20T10:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.608325 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.608385 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.608403 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.608425 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.608444 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:06Z","lastTransitionTime":"2026-03-20T10:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.710985 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.711024 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.711033 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.711050 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.711060 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:06Z","lastTransitionTime":"2026-03-20T10:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.813799 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.813895 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.813913 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.813940 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.813957 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:06Z","lastTransitionTime":"2026-03-20T10:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.916870 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.916918 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.916932 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.916953 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:06 crc kubenswrapper[4860]: I0320 10:56:06.916967 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:06Z","lastTransitionTime":"2026-03-20T10:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.021447 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.021513 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.021543 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.021565 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.021580 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:07Z","lastTransitionTime":"2026-03-20T10:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.127345 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.127407 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.127424 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.127441 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.127454 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:07Z","lastTransitionTime":"2026-03-20T10:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.231185 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.231277 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.231294 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.231313 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.231327 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:07Z","lastTransitionTime":"2026-03-20T10:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.334053 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.334111 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.334125 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.334147 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.334166 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:07Z","lastTransitionTime":"2026-03-20T10:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.413388 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:07 crc kubenswrapper[4860]: E0320 10:56:07.413769 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.428596 4860 scope.go:117] "RemoveContainer" containerID="b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa" Mar 20 10:56:07 crc kubenswrapper[4860]: E0320 10:56:07.428975 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.430647 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.436488 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.436548 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.436567 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.436592 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.436611 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:07Z","lastTransitionTime":"2026-03-20T10:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.439993 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"r
ecursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.461852 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.482079 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.503805 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.523956 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.538994 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.539252 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.539407 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.539212 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.539520 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.539571 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:07Z","lastTransitionTime":"2026-03-20T10:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.641981 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.642051 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.642061 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.642079 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.642089 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:07Z","lastTransitionTime":"2026-03-20T10:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.744656 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.744736 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.744756 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.744784 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.744801 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:07Z","lastTransitionTime":"2026-03-20T10:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.806498 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410"} Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.807157 4860 scope.go:117] "RemoveContainer" containerID="b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa" Mar 20 10:56:07 crc kubenswrapper[4860]: E0320 10:56:07.807372 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.824777 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T10:56:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.844605 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.847335 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.847366 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.847378 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.847393 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.847404 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:07Z","lastTransitionTime":"2026-03-20T10:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.859664 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.873757 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.886291 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.897192 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.908928 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.942218 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:56:07 crc kubenswrapper[4860]: E0320 10:56:07.942546 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 10:56:11.942474774 +0000 UTC m=+96.163835712 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.950016 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.950084 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.950109 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.950137 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:07 crc kubenswrapper[4860]: I0320 10:56:07.950156 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:07Z","lastTransitionTime":"2026-03-20T10:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.042812 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.042889 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.042915 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.042939 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:08 crc kubenswrapper[4860]: E0320 10:56:08.042998 4860 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" 
not registered Mar 20 10:56:08 crc kubenswrapper[4860]: E0320 10:56:08.043019 4860 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 10:56:08 crc kubenswrapper[4860]: E0320 10:56:08.043024 4860 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 10:56:08 crc kubenswrapper[4860]: E0320 10:56:08.043026 4860 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 10:56:08 crc kubenswrapper[4860]: E0320 10:56:08.043041 4860 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:56:08 crc kubenswrapper[4860]: E0320 10:56:08.043034 4860 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 10:56:08 crc kubenswrapper[4860]: E0320 10:56:08.043083 4860 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:56:08 crc kubenswrapper[4860]: E0320 10:56:08.043086 4860 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 10:56:08 crc 
kubenswrapper[4860]: E0320 10:56:08.043088 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:12.043073693 +0000 UTC m=+96.264434591 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 10:56:08 crc kubenswrapper[4860]: E0320 10:56:08.043125 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:12.043113474 +0000 UTC m=+96.264474372 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:56:08 crc kubenswrapper[4860]: E0320 10:56:08.043142 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:12.043131775 +0000 UTC m=+96.264492673 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:56:08 crc kubenswrapper[4860]: E0320 10:56:08.043154 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:12.043147865 +0000 UTC m=+96.264508763 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.052922 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.052954 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.052966 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.052982 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.052995 4860 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:08Z","lastTransitionTime":"2026-03-20T10:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.156110 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.156171 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.156183 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.156205 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.156218 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:08Z","lastTransitionTime":"2026-03-20T10:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.259766 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.259861 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.259879 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.259903 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.259919 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:08Z","lastTransitionTime":"2026-03-20T10:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.363793 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.363856 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.363872 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.363898 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.363910 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:08Z","lastTransitionTime":"2026-03-20T10:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.412842 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.412859 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:08 crc kubenswrapper[4860]: E0320 10:56:08.413098 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:08 crc kubenswrapper[4860]: E0320 10:56:08.413136 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.466514 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.466588 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.466605 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.466628 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.466651 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:08Z","lastTransitionTime":"2026-03-20T10:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.569551 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.569604 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.569622 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.569646 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.569664 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:08Z","lastTransitionTime":"2026-03-20T10:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.673285 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.673329 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.673340 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.673360 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.673371 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:08Z","lastTransitionTime":"2026-03-20T10:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.776055 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.776097 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.776108 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.776126 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.776136 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:08Z","lastTransitionTime":"2026-03-20T10:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.879654 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.879706 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.879717 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.879734 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.879756 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:08Z","lastTransitionTime":"2026-03-20T10:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.982736 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.982807 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.982830 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.982859 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:08 crc kubenswrapper[4860]: I0320 10:56:08.982880 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:08Z","lastTransitionTime":"2026-03-20T10:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.086423 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.086491 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.086509 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.086534 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.086553 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:09Z","lastTransitionTime":"2026-03-20T10:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.189862 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.189927 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.189954 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.189985 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.190007 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:09Z","lastTransitionTime":"2026-03-20T10:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.292451 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.292509 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.292524 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.292544 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.292562 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:09Z","lastTransitionTime":"2026-03-20T10:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.396944 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.397020 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.397044 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.397077 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.397101 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:09Z","lastTransitionTime":"2026-03-20T10:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.413046 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:09 crc kubenswrapper[4860]: E0320 10:56:09.413288 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.499393 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.499462 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.499480 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.499508 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.499527 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:09Z","lastTransitionTime":"2026-03-20T10:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.602972 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.603018 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.603030 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.603049 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.603064 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:09Z","lastTransitionTime":"2026-03-20T10:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.705856 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.705918 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.705929 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.705949 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.705968 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:09Z","lastTransitionTime":"2026-03-20T10:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.808694 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.808735 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.808746 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.808764 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.808777 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:09Z","lastTransitionTime":"2026-03-20T10:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.910948 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.910997 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.911005 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.911021 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:09 crc kubenswrapper[4860]: I0320 10:56:09.911030 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:09Z","lastTransitionTime":"2026-03-20T10:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.014009 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.014108 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.014128 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.014154 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.014174 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:10Z","lastTransitionTime":"2026-03-20T10:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.116644 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.116706 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.116719 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.116742 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.116757 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:10Z","lastTransitionTime":"2026-03-20T10:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.219928 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.220004 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.220019 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.220037 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.220049 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:10Z","lastTransitionTime":"2026-03-20T10:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.323032 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.323075 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.323083 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.323098 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.323111 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:10Z","lastTransitionTime":"2026-03-20T10:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.412648 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.412709 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:10 crc kubenswrapper[4860]: E0320 10:56:10.412848 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:10 crc kubenswrapper[4860]: E0320 10:56:10.413024 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.426048 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.426122 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.426141 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.426171 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.426190 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:10Z","lastTransitionTime":"2026-03-20T10:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.529329 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.529396 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.529409 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.529427 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.529439 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:10Z","lastTransitionTime":"2026-03-20T10:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.632156 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.632332 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.632363 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.632394 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.632415 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:10Z","lastTransitionTime":"2026-03-20T10:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.737329 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.737427 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.737455 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.737489 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.737525 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:10Z","lastTransitionTime":"2026-03-20T10:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.841286 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.841353 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.841370 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.841397 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.841416 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:10Z","lastTransitionTime":"2026-03-20T10:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.944346 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.944431 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.944447 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.944472 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:10 crc kubenswrapper[4860]: I0320 10:56:10.944502 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:10Z","lastTransitionTime":"2026-03-20T10:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.047350 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.047486 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.047515 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.047547 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.047570 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:11Z","lastTransitionTime":"2026-03-20T10:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.150863 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.150925 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.150961 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.150991 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.151016 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:11Z","lastTransitionTime":"2026-03-20T10:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.254291 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.254372 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.254395 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.254425 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.254461 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:11Z","lastTransitionTime":"2026-03-20T10:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.356933 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.356994 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.357009 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.357031 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.357045 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:11Z","lastTransitionTime":"2026-03-20T10:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.412857 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:11 crc kubenswrapper[4860]: E0320 10:56:11.413452 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.427327 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.460125 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.460171 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.460180 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.460196 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.460207 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:11Z","lastTransitionTime":"2026-03-20T10:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.563514 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.563557 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.563566 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.563583 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.563593 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:11Z","lastTransitionTime":"2026-03-20T10:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.668168 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.668219 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.668253 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.668273 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.668285 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:11Z","lastTransitionTime":"2026-03-20T10:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.771263 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.771367 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.771386 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.771447 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.771466 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:11Z","lastTransitionTime":"2026-03-20T10:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.875119 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.875204 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.875260 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.875288 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.875311 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:11Z","lastTransitionTime":"2026-03-20T10:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.978898 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.978968 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.978990 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.979017 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.979039 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:11Z","lastTransitionTime":"2026-03-20T10:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:11 crc kubenswrapper[4860]: I0320 10:56:11.980109 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:56:11 crc kubenswrapper[4860]: E0320 10:56:11.980358 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 10:56:19.980320291 +0000 UTC m=+104.201681229 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.080769 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.080834 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.080887 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.080941 4860 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:12 crc kubenswrapper[4860]: E0320 10:56:12.081002 4860 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 10:56:12 crc kubenswrapper[4860]: E0320 10:56:12.081124 4860 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 10:56:12 crc kubenswrapper[4860]: E0320 10:56:12.081166 4860 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 10:56:12 crc kubenswrapper[4860]: E0320 10:56:12.081208 4860 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 10:56:12 crc kubenswrapper[4860]: E0320 10:56:12.081379 4860 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 10:56:12 crc kubenswrapper[4860]: E0320 10:56:12.081406 4860 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:56:12 crc kubenswrapper[4860]: E0320 10:56:12.081191 4860 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:20.081173607 +0000 UTC m=+104.302534505 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 10:56:12 crc kubenswrapper[4860]: E0320 10:56:12.081465 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:20.081436644 +0000 UTC m=+104.302797552 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 10:56:12 crc kubenswrapper[4860]: E0320 10:56:12.081507 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:20.081476345 +0000 UTC m=+104.302837273 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:56:12 crc kubenswrapper[4860]: E0320 10:56:12.081603 4860 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 10:56:12 crc kubenswrapper[4860]: E0320 10:56:12.081623 4860 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:56:12 crc kubenswrapper[4860]: E0320 10:56:12.081849 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:20.08166937 +0000 UTC m=+104.303030348 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.082387 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.082480 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.082498 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.082522 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.082538 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:12Z","lastTransitionTime":"2026-03-20T10:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.185623 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.185705 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.185719 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.185742 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.185781 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:12Z","lastTransitionTime":"2026-03-20T10:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.288462 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.288544 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.288571 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.288604 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.288628 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:12Z","lastTransitionTime":"2026-03-20T10:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.391620 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.391662 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.391675 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.391692 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.391703 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:12Z","lastTransitionTime":"2026-03-20T10:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.413032 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.413576 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:12 crc kubenswrapper[4860]: E0320 10:56:12.413704 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:12 crc kubenswrapper[4860]: E0320 10:56:12.414055 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.494795 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.494874 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.494893 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.494926 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.494946 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:12Z","lastTransitionTime":"2026-03-20T10:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.598246 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.598315 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.598326 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.598360 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.598372 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:12Z","lastTransitionTime":"2026-03-20T10:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.701529 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.701600 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.701617 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.701642 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.701659 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:12Z","lastTransitionTime":"2026-03-20T10:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.805660 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.805748 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.805766 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.805795 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.805816 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:12Z","lastTransitionTime":"2026-03-20T10:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.909614 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.909688 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.909705 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.909736 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:12 crc kubenswrapper[4860]: I0320 10:56:12.909757 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:12Z","lastTransitionTime":"2026-03-20T10:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.012686 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.012773 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.012789 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.012809 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.012824 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:13Z","lastTransitionTime":"2026-03-20T10:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.115940 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.116006 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.116022 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.116047 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.116063 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:13Z","lastTransitionTime":"2026-03-20T10:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.220193 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.220296 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.220318 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.220344 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.220362 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:13Z","lastTransitionTime":"2026-03-20T10:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.323811 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.323887 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.323910 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.323942 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.323966 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:13Z","lastTransitionTime":"2026-03-20T10:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.413361 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:13 crc kubenswrapper[4860]: E0320 10:56:13.413547 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.426068 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.426133 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.426145 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.426180 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.426194 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:13Z","lastTransitionTime":"2026-03-20T10:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.529031 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.529261 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.529302 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.529338 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.529361 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:13Z","lastTransitionTime":"2026-03-20T10:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.632163 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.632326 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.632363 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.632414 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.632439 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:13Z","lastTransitionTime":"2026-03-20T10:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.734979 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.735024 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.735032 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.735051 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.735063 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:13Z","lastTransitionTime":"2026-03-20T10:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.837608 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.837663 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.837673 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.837687 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.837697 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:13Z","lastTransitionTime":"2026-03-20T10:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.940578 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.940630 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.940643 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.940668 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:13 crc kubenswrapper[4860]: I0320 10:56:13.940684 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:13Z","lastTransitionTime":"2026-03-20T10:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.042688 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.042739 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.042752 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.042773 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.042785 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:14Z","lastTransitionTime":"2026-03-20T10:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.147289 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.147340 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.147353 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.147372 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.147386 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:14Z","lastTransitionTime":"2026-03-20T10:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.250122 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.250175 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.250187 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.250204 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.250217 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:14Z","lastTransitionTime":"2026-03-20T10:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.353302 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.353373 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.353390 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.353416 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.353434 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:14Z","lastTransitionTime":"2026-03-20T10:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.413252 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.413371 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:14 crc kubenswrapper[4860]: E0320 10:56:14.413714 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:14 crc kubenswrapper[4860]: E0320 10:56:14.413951 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.457394 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.457811 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.457893 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.457972 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.458036 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:14Z","lastTransitionTime":"2026-03-20T10:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.561103 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.561152 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.561167 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.561190 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.561209 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:14Z","lastTransitionTime":"2026-03-20T10:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.663744 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.663793 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.663803 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.663823 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.663835 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:14Z","lastTransitionTime":"2026-03-20T10:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.767362 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.767403 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.767415 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.767432 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.767448 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:14Z","lastTransitionTime":"2026-03-20T10:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.869588 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.869620 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.869629 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.869643 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.869654 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:14Z","lastTransitionTime":"2026-03-20T10:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.972915 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.972986 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.973006 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.973034 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:14 crc kubenswrapper[4860]: I0320 10:56:14.973055 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:14Z","lastTransitionTime":"2026-03-20T10:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.076632 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.076706 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.076749 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.076787 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.076812 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:15Z","lastTransitionTime":"2026-03-20T10:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.179432 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.179468 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.179478 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.179503 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.179516 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:15Z","lastTransitionTime":"2026-03-20T10:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.283084 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.283134 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.283152 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.283179 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.283198 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:15Z","lastTransitionTime":"2026-03-20T10:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.387050 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.387120 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.387131 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.387150 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.387163 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:15Z","lastTransitionTime":"2026-03-20T10:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.412449 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:15 crc kubenswrapper[4860]: E0320 10:56:15.412654 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.490110 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.490191 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.490202 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.490240 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.490260 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:15Z","lastTransitionTime":"2026-03-20T10:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.592614 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.592690 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.592709 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.592738 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.592759 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:15Z","lastTransitionTime":"2026-03-20T10:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.695500 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.695550 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.695564 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.695586 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.695602 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:15Z","lastTransitionTime":"2026-03-20T10:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.798714 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.798780 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.798798 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.798824 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.798842 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:15Z","lastTransitionTime":"2026-03-20T10:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.901683 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.901742 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.901760 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.901789 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:15 crc kubenswrapper[4860]: I0320 10:56:15.901806 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:15Z","lastTransitionTime":"2026-03-20T10:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.004906 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.004937 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.004945 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.004960 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.004969 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:16Z","lastTransitionTime":"2026-03-20T10:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.108156 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.108282 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.108309 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.108343 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.108368 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:16Z","lastTransitionTime":"2026-03-20T10:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.212023 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.212133 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.212184 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.212218 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.212282 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:16Z","lastTransitionTime":"2026-03-20T10:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.312887 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.312955 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.312979 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.313009 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.313030 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:16Z","lastTransitionTime":"2026-03-20T10:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:16 crc kubenswrapper[4860]: E0320 10:56:16.338471 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:16Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.345945 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.346019 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.346039 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.346065 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.346086 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:16Z","lastTransitionTime":"2026-03-20T10:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:16 crc kubenswrapper[4860]: E0320 10:56:16.381379 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:16Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.389386 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.389459 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.389483 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.389516 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.389541 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:16Z","lastTransitionTime":"2026-03-20T10:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.412524 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.412522 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:16 crc kubenswrapper[4860]: E0320 10:56:16.412731 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:16 crc kubenswrapper[4860]: E0320 10:56:16.412870 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:16 crc kubenswrapper[4860]: E0320 10:56:16.424483 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:16Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.429431 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.429503 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.429517 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.429540 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.429566 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:16Z","lastTransitionTime":"2026-03-20T10:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:16 crc kubenswrapper[4860]: E0320 10:56:16.445260 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:16Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.449146 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.449201 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.449215 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.449249 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.449263 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:16Z","lastTransitionTime":"2026-03-20T10:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:16 crc kubenswrapper[4860]: E0320 10:56:16.464807 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:16Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:16 crc kubenswrapper[4860]: E0320 10:56:16.465260 4860 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.468597 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.468654 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.468667 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.468687 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.468699 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:16Z","lastTransitionTime":"2026-03-20T10:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.571753 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.571823 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.571840 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.571864 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.571879 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:16Z","lastTransitionTime":"2026-03-20T10:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.674851 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.674906 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.674923 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.674945 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.674962 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:16Z","lastTransitionTime":"2026-03-20T10:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.778373 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.778457 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.778491 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.778522 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.778540 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:16Z","lastTransitionTime":"2026-03-20T10:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.882074 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.882123 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.882139 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.882158 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.882171 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:16Z","lastTransitionTime":"2026-03-20T10:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.985495 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.985578 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.985592 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.985617 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:16 crc kubenswrapper[4860]: I0320 10:56:16.985633 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:16Z","lastTransitionTime":"2026-03-20T10:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.089038 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.089094 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.089107 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.089127 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.089142 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:17Z","lastTransitionTime":"2026-03-20T10:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.191693 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.191805 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.191860 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.191891 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.191911 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:17Z","lastTransitionTime":"2026-03-20T10:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.295393 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.295477 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.295500 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.295529 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.295549 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:17Z","lastTransitionTime":"2026-03-20T10:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.399181 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.399304 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.399384 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.399416 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.399438 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:17Z","lastTransitionTime":"2026-03-20T10:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.412916 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:17 crc kubenswrapper[4860]: E0320 10:56:17.413279 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.437296 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"r
ecursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:17Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.459079 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":
\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:17Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.482666 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:17Z 
is after 2025-08-24T17:21:41Z" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.502663 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.502732 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.502742 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.502759 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.502770 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:17Z","lastTransitionTime":"2026-03-20T10:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.508585 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:17Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.533793 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:17Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.554052 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:17Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.574149 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:17Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.590218 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:17Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.605552 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.605617 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.605634 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:17 crc 
kubenswrapper[4860]: I0320 10:56:17.605656 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.605670 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:17Z","lastTransitionTime":"2026-03-20T10:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.709479 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.709536 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.709547 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.709567 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.709589 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:17Z","lastTransitionTime":"2026-03-20T10:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.813340 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.813427 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.813445 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.813471 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.813489 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:17Z","lastTransitionTime":"2026-03-20T10:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.917768 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.917837 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.917853 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.917888 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:17 crc kubenswrapper[4860]: I0320 10:56:17.917905 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:17Z","lastTransitionTime":"2026-03-20T10:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.020775 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.020822 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.020840 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.020865 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.020882 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:18Z","lastTransitionTime":"2026-03-20T10:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.124925 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.124983 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.124996 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.125017 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.125031 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:18Z","lastTransitionTime":"2026-03-20T10:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.228799 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.228859 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.228872 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.228893 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.228907 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:18Z","lastTransitionTime":"2026-03-20T10:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.332324 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.332397 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.332410 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.332432 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.332451 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:18Z","lastTransitionTime":"2026-03-20T10:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.412656 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.412656 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:18 crc kubenswrapper[4860]: E0320 10:56:18.412805 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:18 crc kubenswrapper[4860]: E0320 10:56:18.413084 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.435837 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.435889 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.435907 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.435936 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.435949 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:18Z","lastTransitionTime":"2026-03-20T10:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.538507 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.538662 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.538690 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.538713 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.538730 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:18Z","lastTransitionTime":"2026-03-20T10:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.642009 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.642064 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.642083 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.642108 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.642126 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:18Z","lastTransitionTime":"2026-03-20T10:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.746173 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.746271 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.746295 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.746328 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.746349 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:18Z","lastTransitionTime":"2026-03-20T10:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.848945 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.848978 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.848987 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.849001 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.849009 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:18Z","lastTransitionTime":"2026-03-20T10:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.951360 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.951468 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.951493 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.951525 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:18 crc kubenswrapper[4860]: I0320 10:56:18.951552 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:18Z","lastTransitionTime":"2026-03-20T10:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.054539 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.054582 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.054593 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.054612 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.054624 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:19Z","lastTransitionTime":"2026-03-20T10:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.157455 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.157505 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.157516 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.157536 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.157548 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:19Z","lastTransitionTime":"2026-03-20T10:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.260195 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.260948 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.260966 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.260984 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.260996 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:19Z","lastTransitionTime":"2026-03-20T10:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.364159 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.364209 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.364219 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.364248 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.364259 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:19Z","lastTransitionTime":"2026-03-20T10:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.413608 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:19 crc kubenswrapper[4860]: E0320 10:56:19.413882 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.414083 4860 scope.go:117] "RemoveContainer" containerID="b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa" Mar 20 10:56:19 crc kubenswrapper[4860]: E0320 10:56:19.414281 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.466715 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.466776 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.466787 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.466807 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.466819 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:19Z","lastTransitionTime":"2026-03-20T10:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.569733 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.569774 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.569785 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.569801 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.569813 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:19Z","lastTransitionTime":"2026-03-20T10:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.673054 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.673155 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.673181 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.673211 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.673277 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:19Z","lastTransitionTime":"2026-03-20T10:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.775791 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.775844 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.775858 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.775877 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.775892 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:19Z","lastTransitionTime":"2026-03-20T10:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.878189 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.878281 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.878294 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.878312 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.878323 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:19Z","lastTransitionTime":"2026-03-20T10:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.980940 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.980989 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.981000 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.981018 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:19 crc kubenswrapper[4860]: I0320 10:56:19.981029 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:19Z","lastTransitionTime":"2026-03-20T10:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.058709 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:56:20 crc kubenswrapper[4860]: E0320 10:56:20.058873 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 10:56:36.058854582 +0000 UTC m=+120.280215480 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.084649 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.084700 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.084711 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.084730 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.084741 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:20Z","lastTransitionTime":"2026-03-20T10:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.160219 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.160299 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.160334 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.160366 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:20 crc kubenswrapper[4860]: E0320 10:56:20.160423 4860 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Mar 20 10:56:20 crc kubenswrapper[4860]: E0320 10:56:20.160459 4860 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 10:56:20 crc kubenswrapper[4860]: E0320 10:56:20.160468 4860 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 10:56:20 crc kubenswrapper[4860]: E0320 10:56:20.160485 4860 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 10:56:20 crc kubenswrapper[4860]: E0320 10:56:20.160492 4860 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 10:56:20 crc kubenswrapper[4860]: E0320 10:56:20.160500 4860 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:56:20 crc kubenswrapper[4860]: E0320 10:56:20.160503 4860 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:56:20 crc kubenswrapper[4860]: E0320 10:56:20.160502 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf 
podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:36.160480379 +0000 UTC m=+120.381841277 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 10:56:20 crc kubenswrapper[4860]: E0320 10:56:20.160466 4860 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 10:56:20 crc kubenswrapper[4860]: E0320 10:56:20.160548 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:36.16053703 +0000 UTC m=+120.381897938 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:56:20 crc kubenswrapper[4860]: E0320 10:56:20.160570 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:36.160558671 +0000 UTC m=+120.381919569 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 10:56:20 crc kubenswrapper[4860]: E0320 10:56:20.160588 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:36.160580021 +0000 UTC m=+120.381940919 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.187515 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.187552 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.187561 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.187575 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.187584 4860 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:20Z","lastTransitionTime":"2026-03-20T10:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.290071 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.290108 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.290120 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.290134 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.290144 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:20Z","lastTransitionTime":"2026-03-20T10:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.393434 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.393504 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.393520 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.393550 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.393564 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:20Z","lastTransitionTime":"2026-03-20T10:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.413273 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:20 crc kubenswrapper[4860]: E0320 10:56:20.413566 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.413289 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:20 crc kubenswrapper[4860]: E0320 10:56:20.413981 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.496198 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.496488 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.496579 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.496713 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.496794 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:20Z","lastTransitionTime":"2026-03-20T10:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.599439 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.599514 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.599531 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.599556 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.599575 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:20Z","lastTransitionTime":"2026-03-20T10:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.702458 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.702521 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.702531 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.702548 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.702558 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:20Z","lastTransitionTime":"2026-03-20T10:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.805528 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.805817 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.805889 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.805995 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.806142 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:20Z","lastTransitionTime":"2026-03-20T10:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.909433 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.909492 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.909507 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.909531 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:20 crc kubenswrapper[4860]: I0320 10:56:20.909549 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:20Z","lastTransitionTime":"2026-03-20T10:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.012709 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.012769 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.012779 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.012796 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.012807 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:21Z","lastTransitionTime":"2026-03-20T10:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.115451 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.115541 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.115557 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.115585 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.115602 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:21Z","lastTransitionTime":"2026-03-20T10:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.219183 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.219260 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.219272 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.219294 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.219309 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:21Z","lastTransitionTime":"2026-03-20T10:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.322339 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.322387 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.322398 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.322417 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.322430 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:21Z","lastTransitionTime":"2026-03-20T10:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.413316 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:21 crc kubenswrapper[4860]: E0320 10:56:21.413520 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.425036 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.425079 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.425089 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.425105 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.425116 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:21Z","lastTransitionTime":"2026-03-20T10:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.527644 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.527702 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.527713 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.527730 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.527744 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:21Z","lastTransitionTime":"2026-03-20T10:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.630029 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.630086 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.630095 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.630112 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.630122 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:21Z","lastTransitionTime":"2026-03-20T10:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.642084 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-srbpg"] Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.642483 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-srbpg" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.644806 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.645175 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.646870 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.657670 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":
{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:21Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.671084 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:21Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.684217 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:21Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.696832 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:21Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.708413 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:21Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.721135 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:21Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.732672 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 
10:56:21.732761 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.732780 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.732808 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.732840 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:21Z","lastTransitionTime":"2026-03-20T10:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.733772 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:21Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.758011 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:21Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.774861 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:21Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.778285 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ml6nd\" (UniqueName: \"kubernetes.io/projected/93e597b5-a377-4988-8c59-eeace5ffa4e4-kube-api-access-ml6nd\") pod \"node-resolver-srbpg\" (UID: \"93e597b5-a377-4988-8c59-eeace5ffa4e4\") " pod="openshift-dns/node-resolver-srbpg" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.778369 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/93e597b5-a377-4988-8c59-eeace5ffa4e4-hosts-file\") pod \"node-resolver-srbpg\" (UID: \"93e597b5-a377-4988-8c59-eeace5ffa4e4\") " pod="openshift-dns/node-resolver-srbpg" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.835821 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.835903 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.835924 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.835946 4860 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.835966 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:21Z","lastTransitionTime":"2026-03-20T10:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.879501 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ml6nd\" (UniqueName: \"kubernetes.io/projected/93e597b5-a377-4988-8c59-eeace5ffa4e4-kube-api-access-ml6nd\") pod \"node-resolver-srbpg\" (UID: \"93e597b5-a377-4988-8c59-eeace5ffa4e4\") " pod="openshift-dns/node-resolver-srbpg" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.879611 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/93e597b5-a377-4988-8c59-eeace5ffa4e4-hosts-file\") pod \"node-resolver-srbpg\" (UID: \"93e597b5-a377-4988-8c59-eeace5ffa4e4\") " pod="openshift-dns/node-resolver-srbpg" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.879752 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/93e597b5-a377-4988-8c59-eeace5ffa4e4-hosts-file\") pod \"node-resolver-srbpg\" (UID: \"93e597b5-a377-4988-8c59-eeace5ffa4e4\") " pod="openshift-dns/node-resolver-srbpg" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.900498 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ml6nd\" (UniqueName: \"kubernetes.io/projected/93e597b5-a377-4988-8c59-eeace5ffa4e4-kube-api-access-ml6nd\") pod \"node-resolver-srbpg\" (UID: 
\"93e597b5-a377-4988-8c59-eeace5ffa4e4\") " pod="openshift-dns/node-resolver-srbpg" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.940070 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.940129 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.940143 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.940163 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.940179 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:21Z","lastTransitionTime":"2026-03-20T10:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:21 crc kubenswrapper[4860]: I0320 10:56:21.955429 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-srbpg" Mar 20 10:56:21 crc kubenswrapper[4860]: W0320 10:56:21.971837 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93e597b5_a377_4988_8c59_eeace5ffa4e4.slice/crio-09389447edc5810169163731be1f52e1fae6e57d63d9deda32883a0652640d03 WatchSource:0}: Error finding container 09389447edc5810169163731be1f52e1fae6e57d63d9deda32883a0652640d03: Status 404 returned error can't find the container with id 09389447edc5810169163731be1f52e1fae6e57d63d9deda32883a0652640d03 Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.024672 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-kvdqp"] Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.025125 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.027819 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.028191 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.028444 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-cmc44"] Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.028460 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.028603 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.028621 4860 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.028800 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-wpj5w"] Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.029103 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.029371 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.030818 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.031047 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.031284 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.031332 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.031796 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.031983 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.032329 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.043488 4860 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.043518 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.043529 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.043542 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.043553 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:22Z","lastTransitionTime":"2026-03-20T10:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.043666 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.058155 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.070829 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.091617 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.106818 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.121907 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.137491 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.145906 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.145948 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.145958 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:22 crc 
kubenswrapper[4860]: I0320 10:56:22.145973 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.145986 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:22Z","lastTransitionTime":"2026-03-20T10:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.150212 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.164839 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.179727 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.182156 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/329ab518-a391-4483-8373-1329318b58da-system-cni-dir\") pod \"multus-additional-cni-plugins-wpj5w\" (UID: \"329ab518-a391-4483-8373-1329318b58da\") " pod="openshift-multus/multus-additional-cni-plugins-wpj5w" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.182219 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/329ab518-a391-4483-8373-1329318b58da-os-release\") pod \"multus-additional-cni-plugins-wpj5w\" (UID: \"329ab518-a391-4483-8373-1329318b58da\") " pod="openshift-multus/multus-additional-cni-plugins-wpj5w" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.182328 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/329ab518-a391-4483-8373-1329318b58da-cni-binary-copy\") pod \"multus-additional-cni-plugins-wpj5w\" (UID: 
\"329ab518-a391-4483-8373-1329318b58da\") " pod="openshift-multus/multus-additional-cni-plugins-wpj5w" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.182361 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6a9df230-75a1-4b64-8d00-c179e9c19080-mcd-auth-proxy-config\") pod \"machine-config-daemon-kvdqp\" (UID: \"6a9df230-75a1-4b64-8d00-c179e9c19080\") " pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.182386 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhhqf\" (UniqueName: \"kubernetes.io/projected/6a9df230-75a1-4b64-8d00-c179e9c19080-kube-api-access-vhhqf\") pod \"machine-config-daemon-kvdqp\" (UID: \"6a9df230-75a1-4b64-8d00-c179e9c19080\") " pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.182411 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/329ab518-a391-4483-8373-1329318b58da-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wpj5w\" (UID: \"329ab518-a391-4483-8373-1329318b58da\") " pod="openshift-multus/multus-additional-cni-plugins-wpj5w" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.182446 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6a9df230-75a1-4b64-8d00-c179e9c19080-proxy-tls\") pod \"machine-config-daemon-kvdqp\" (UID: \"6a9df230-75a1-4b64-8d00-c179e9c19080\") " pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.182477 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/6a9df230-75a1-4b64-8d00-c179e9c19080-rootfs\") pod \"machine-config-daemon-kvdqp\" (UID: \"6a9df230-75a1-4b64-8d00-c179e9c19080\") " pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.182511 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/329ab518-a391-4483-8373-1329318b58da-cnibin\") pod \"multus-additional-cni-plugins-wpj5w\" (UID: \"329ab518-a391-4483-8373-1329318b58da\") " pod="openshift-multus/multus-additional-cni-plugins-wpj5w" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.182543 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-system-cni-dir\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.182570 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-host-run-multus-certs\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.182633 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-host-run-k8s-cni-cncf-io\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.182664 4860 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-cnibin\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.182723 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w22pp\" (UniqueName: \"kubernetes.io/projected/a89c8af2-338f-401f-aad5-c6d7763a3b3a-kube-api-access-w22pp\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.182759 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a89c8af2-338f-401f-aad5-c6d7763a3b3a-cni-binary-copy\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.182785 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-hostroot\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.182817 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a89c8af2-338f-401f-aad5-c6d7763a3b3a-multus-daemon-config\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.182938 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-host-run-netns\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.183052 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-multus-conf-dir\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.183114 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/329ab518-a391-4483-8373-1329318b58da-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wpj5w\" (UID: \"329ab518-a391-4483-8373-1329318b58da\") " pod="openshift-multus/multus-additional-cni-plugins-wpj5w" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.183153 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-multus-cni-dir\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.183184 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-multus-socket-dir-parent\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.183211 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-host-var-lib-cni-multus\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.183281 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-host-var-lib-kubelet\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.183310 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-os-release\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.183337 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-242fz\" (UniqueName: \"kubernetes.io/projected/329ab518-a391-4483-8373-1329318b58da-kube-api-access-242fz\") pod \"multus-additional-cni-plugins-wpj5w\" (UID: \"329ab518-a391-4483-8373-1329318b58da\") " pod="openshift-multus/multus-additional-cni-plugins-wpj5w" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.183370 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-etc-kubernetes\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.183420 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-host-var-lib-cni-bin\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.197145 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.214905 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.232694 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.248699 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.248739 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.248758 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.248776 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.248788 4860 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:22Z","lastTransitionTime":"2026-03-20T10:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.249992 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.273161 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.284346 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-host-var-lib-cni-multus\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.284397 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-host-var-lib-kubelet\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.284418 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-os-release\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.284445 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-242fz\" (UniqueName: 
\"kubernetes.io/projected/329ab518-a391-4483-8373-1329318b58da-kube-api-access-242fz\") pod \"multus-additional-cni-plugins-wpj5w\" (UID: \"329ab518-a391-4483-8373-1329318b58da\") " pod="openshift-multus/multus-additional-cni-plugins-wpj5w" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.284469 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-etc-kubernetes\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.284494 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-host-var-lib-cni-bin\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.284514 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/329ab518-a391-4483-8373-1329318b58da-system-cni-dir\") pod \"multus-additional-cni-plugins-wpj5w\" (UID: \"329ab518-a391-4483-8373-1329318b58da\") " pod="openshift-multus/multus-additional-cni-plugins-wpj5w" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.284536 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/329ab518-a391-4483-8373-1329318b58da-os-release\") pod \"multus-additional-cni-plugins-wpj5w\" (UID: \"329ab518-a391-4483-8373-1329318b58da\") " pod="openshift-multus/multus-additional-cni-plugins-wpj5w" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.284533 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-host-var-lib-cni-multus\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.284555 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/329ab518-a391-4483-8373-1329318b58da-cni-binary-copy\") pod \"multus-additional-cni-plugins-wpj5w\" (UID: \"329ab518-a391-4483-8373-1329318b58da\") " pod="openshift-multus/multus-additional-cni-plugins-wpj5w" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.284658 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-host-var-lib-kubelet\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.284737 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-etc-kubernetes\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.284773 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-os-release\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.284748 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/329ab518-a391-4483-8373-1329318b58da-system-cni-dir\") pod \"multus-additional-cni-plugins-wpj5w\" (UID: 
\"329ab518-a391-4483-8373-1329318b58da\") " pod="openshift-multus/multus-additional-cni-plugins-wpj5w" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.284780 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/329ab518-a391-4483-8373-1329318b58da-os-release\") pod \"multus-additional-cni-plugins-wpj5w\" (UID: \"329ab518-a391-4483-8373-1329318b58da\") " pod="openshift-multus/multus-additional-cni-plugins-wpj5w" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.284806 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6a9df230-75a1-4b64-8d00-c179e9c19080-proxy-tls\") pod \"machine-config-daemon-kvdqp\" (UID: \"6a9df230-75a1-4b64-8d00-c179e9c19080\") " pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.284843 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-host-var-lib-cni-bin\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.284871 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6a9df230-75a1-4b64-8d00-c179e9c19080-mcd-auth-proxy-config\") pod \"machine-config-daemon-kvdqp\" (UID: \"6a9df230-75a1-4b64-8d00-c179e9c19080\") " pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.284906 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhhqf\" (UniqueName: \"kubernetes.io/projected/6a9df230-75a1-4b64-8d00-c179e9c19080-kube-api-access-vhhqf\") pod \"machine-config-daemon-kvdqp\" 
(UID: \"6a9df230-75a1-4b64-8d00-c179e9c19080\") " pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.284923 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/329ab518-a391-4483-8373-1329318b58da-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wpj5w\" (UID: \"329ab518-a391-4483-8373-1329318b58da\") " pod="openshift-multus/multus-additional-cni-plugins-wpj5w" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.284979 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/6a9df230-75a1-4b64-8d00-c179e9c19080-rootfs\") pod \"machine-config-daemon-kvdqp\" (UID: \"6a9df230-75a1-4b64-8d00-c179e9c19080\") " pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.284999 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/329ab518-a391-4483-8373-1329318b58da-cnibin\") pod \"multus-additional-cni-plugins-wpj5w\" (UID: \"329ab518-a391-4483-8373-1329318b58da\") " pod="openshift-multus/multus-additional-cni-plugins-wpj5w" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.285020 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-system-cni-dir\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.285037 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-host-run-multus-certs\") pod \"multus-cmc44\" (UID: 
\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.285055 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-host-run-k8s-cni-cncf-io\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.285074 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-cnibin\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.285090 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w22pp\" (UniqueName: \"kubernetes.io/projected/a89c8af2-338f-401f-aad5-c6d7763a3b3a-kube-api-access-w22pp\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.285088 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/329ab518-a391-4483-8373-1329318b58da-cnibin\") pod \"multus-additional-cni-plugins-wpj5w\" (UID: \"329ab518-a391-4483-8373-1329318b58da\") " pod="openshift-multus/multus-additional-cni-plugins-wpj5w" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.285114 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a89c8af2-338f-401f-aad5-c6d7763a3b3a-cni-binary-copy\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 
10:56:22.285125 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-host-run-multus-certs\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.285139 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/6a9df230-75a1-4b64-8d00-c179e9c19080-rootfs\") pod \"machine-config-daemon-kvdqp\" (UID: \"6a9df230-75a1-4b64-8d00-c179e9c19080\") " pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.285156 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-hostroot\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.285131 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-hostroot\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.285185 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a89c8af2-338f-401f-aad5-c6d7763a3b3a-multus-daemon-config\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.285209 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/329ab518-a391-4483-8373-1329318b58da-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wpj5w\" (UID: \"329ab518-a391-4483-8373-1329318b58da\") " pod="openshift-multus/multus-additional-cni-plugins-wpj5w" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.285240 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-cnibin\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.285285 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-host-run-netns\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.285259 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-host-run-netns\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.285353 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-multus-conf-dir\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.285390 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-multus-cni-dir\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " 
pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.285273 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-host-run-k8s-cni-cncf-io\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.285420 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-multus-socket-dir-parent\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.285446 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-multus-conf-dir\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.285514 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-multus-socket-dir-parent\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.285581 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-system-cni-dir\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.285608 4860 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a89c8af2-338f-401f-aad5-c6d7763a3b3a-multus-cni-dir\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.285969 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a89c8af2-338f-401f-aad5-c6d7763a3b3a-multus-daemon-config\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.285968 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a89c8af2-338f-401f-aad5-c6d7763a3b3a-cni-binary-copy\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.286045 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/329ab518-a391-4483-8373-1329318b58da-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wpj5w\" (UID: \"329ab518-a391-4483-8373-1329318b58da\") " pod="openshift-multus/multus-additional-cni-plugins-wpj5w" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.286160 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/329ab518-a391-4483-8373-1329318b58da-cni-binary-copy\") pod \"multus-additional-cni-plugins-wpj5w\" (UID: \"329ab518-a391-4483-8373-1329318b58da\") " pod="openshift-multus/multus-additional-cni-plugins-wpj5w" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.286591 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/329ab518-a391-4483-8373-1329318b58da-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wpj5w\" (UID: \"329ab518-a391-4483-8373-1329318b58da\") " pod="openshift-multus/multus-additional-cni-plugins-wpj5w" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.289429 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6a9df230-75a1-4b64-8d00-c179e9c19080-mcd-auth-proxy-config\") pod \"machine-config-daemon-kvdqp\" (UID: \"6a9df230-75a1-4b64-8d00-c179e9c19080\") " pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.289541 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6a9df230-75a1-4b64-8d00-c179e9c19080-proxy-tls\") pod \"machine-config-daemon-kvdqp\" (UID: \"6a9df230-75a1-4b64-8d00-c179e9c19080\") " pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.289888 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.303142 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.303602 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-242fz\" (UniqueName: \"kubernetes.io/projected/329ab518-a391-4483-8373-1329318b58da-kube-api-access-242fz\") pod \"multus-additional-cni-plugins-wpj5w\" (UID: \"329ab518-a391-4483-8373-1329318b58da\") " pod="openshift-multus/multus-additional-cni-plugins-wpj5w" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.305750 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhhqf\" (UniqueName: 
\"kubernetes.io/projected/6a9df230-75a1-4b64-8d00-c179e9c19080-kube-api-access-vhhqf\") pod \"machine-config-daemon-kvdqp\" (UID: \"6a9df230-75a1-4b64-8d00-c179e9c19080\") " pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.305922 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w22pp\" (UniqueName: \"kubernetes.io/projected/a89c8af2-338f-401f-aad5-c6d7763a3b3a-kube-api-access-w22pp\") pod \"multus-cmc44\" (UID: \"a89c8af2-338f-401f-aad5-c6d7763a3b3a\") " pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.319099 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.334441 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.343412 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.349874 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-cmc44" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.352005 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.352050 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.352065 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.352087 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.352101 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:22Z","lastTransitionTime":"2026-03-20T10:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.353857 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: W0320 10:56:22.355028 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a9df230_75a1_4b64_8d00_c179e9c19080.slice/crio-196bd2b259b76529f9465763d2b9768b290464157dfac8f795e04bd6cd98a4a3 WatchSource:0}: Error finding container 196bd2b259b76529f9465763d2b9768b290464157dfac8f795e04bd6cd98a4a3: Status 404 returned error can't find the container with id 196bd2b259b76529f9465763d2b9768b290464157dfac8f795e04bd6cd98a4a3 Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.355924 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" Mar 20 10:56:22 crc kubenswrapper[4860]: W0320 10:56:22.366572 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda89c8af2_338f_401f_aad5_c6d7763a3b3a.slice/crio-87718fcea05d1a2abf6fd28ad257ff7c1a08c5fa78b0fee32a770f100ba3e177 WatchSource:0}: Error finding container 87718fcea05d1a2abf6fd28ad257ff7c1a08c5fa78b0fee32a770f100ba3e177: Status 404 returned error can't find the container with id 87718fcea05d1a2abf6fd28ad257ff7c1a08c5fa78b0fee32a770f100ba3e177 Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.367811 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: W0320 10:56:22.377285 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod329ab518_a391_4483_8373_1329318b58da.slice/crio-86a469bdc9a3d9e601f368890023052fd83e84c1a16211f91ad47e47cda099e9 WatchSource:0}: Error finding container 86a469bdc9a3d9e601f368890023052fd83e84c1a16211f91ad47e47cda099e9: Status 404 returned error can't find the container with id 86a469bdc9a3d9e601f368890023052fd83e84c1a16211f91ad47e47cda099e9 Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.383790 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.406111 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-nbkmw"] Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.407096 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.409382 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.411370 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.411438 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.411567 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.411378 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.412014 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.412154 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.412357 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:22 crc kubenswrapper[4860]: E0320 10:56:22.412446 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.414109 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:22 crc kubenswrapper[4860]: E0320 10:56:22.414286 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.428034 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nbkmw\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.445578 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.454888 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.454950 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.454966 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.454992 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.455006 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:22Z","lastTransitionTime":"2026-03-20T10:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.460004 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.475329 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.490116 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.505445 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.521008 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.537165 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.558314 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.558359 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.558370 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.558387 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.558398 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:22Z","lastTransitionTime":"2026-03-20T10:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.560118 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountP
ath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f
42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90
092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finis
hedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.576019 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.589165 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-var-lib-openvswitch\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.589155 4860 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-dns/node-resolver-srbpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.589210 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-etc-openvswitch\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.589357 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" 
(UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-kubelet\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.589377 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-node-log\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.589404 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.589424 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-ovn-node-metrics-cert\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.589503 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-run-ovn\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.589536 4860 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-ovnkube-script-lib\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.589558 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9btp\" (UniqueName: \"kubernetes.io/projected/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-kube-api-access-l9btp\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.589599 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-ovnkube-config\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.589614 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-env-overrides\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.589631 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-slash\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.589665 
4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-run-ovn-kubernetes\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.589860 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-run-openvswitch\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.589887 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-cni-bin\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.589916 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-run-netns\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.589938 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-cni-netd\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc 
kubenswrapper[4860]: I0320 10:56:22.589967 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-run-systemd\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.589989 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-log-socket\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.590017 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-systemd-units\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.603709 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.621027 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.660933 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.660992 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.661002 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.661021 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.661031 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:22Z","lastTransitionTime":"2026-03-20T10:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.691392 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-etc-openvswitch\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.691435 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-kubelet\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.691461 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-node-log\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.691482 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.691498 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-etc-openvswitch\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 
10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.691508 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-ovn-node-metrics-cert\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.691529 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-run-ovn\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.691548 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-ovnkube-script-lib\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.691560 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-node-log\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.691571 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9btp\" (UniqueName: \"kubernetes.io/projected/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-kube-api-access-l9btp\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.691623 4860 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-ovnkube-config\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.691647 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-env-overrides\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.691672 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-slash\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.691660 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-kubelet\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.691724 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-run-ovn-kubernetes\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.691722 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.691692 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-run-ovn-kubernetes\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.691854 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-slash\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.691880 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-run-ovn\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.691934 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-run-openvswitch\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.692052 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-run-openvswitch\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.692085 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-cni-bin\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.692200 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-cni-bin\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.692296 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-run-netns\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.692884 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-cni-netd\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.692915 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-run-systemd\") pod 
\"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.692944 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-cni-netd\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.692937 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-log-socket\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.692759 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-ovnkube-script-lib\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.692666 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-env-overrides\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.692994 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-systemd-units\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.693012 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-log-socket\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.692778 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-ovnkube-config\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.692987 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-run-systemd\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.692318 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-run-netns\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.693043 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-systemd-units\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.693090 4860 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-var-lib-openvswitch\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.693126 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-var-lib-openvswitch\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.696717 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-ovn-node-metrics-cert\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.710513 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9btp\" (UniqueName: \"kubernetes.io/projected/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-kube-api-access-l9btp\") pod \"ovnkube-node-nbkmw\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.740111 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.763332 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.763373 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.763382 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.763400 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.763412 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:22Z","lastTransitionTime":"2026-03-20T10:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:22 crc kubenswrapper[4860]: W0320 10:56:22.772825 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb85f6f9_1c0f_4388_9464_25dfe48d8d0f.slice/crio-f02c2b7cd3b6fd372807cb3bef1e94c63376211be0ef5968a6850857b9722fe7 WatchSource:0}: Error finding container f02c2b7cd3b6fd372807cb3bef1e94c63376211be0ef5968a6850857b9722fe7: Status 404 returned error can't find the container with id f02c2b7cd3b6fd372807cb3bef1e94c63376211be0ef5968a6850857b9722fe7 Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.854612 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" event={"ID":"329ab518-a391-4483-8373-1329318b58da","Type":"ContainerStarted","Data":"ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d"} Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.854722 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" event={"ID":"329ab518-a391-4483-8373-1329318b58da","Type":"ContainerStarted","Data":"86a469bdc9a3d9e601f368890023052fd83e84c1a16211f91ad47e47cda099e9"} Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.856540 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cmc44" event={"ID":"a89c8af2-338f-401f-aad5-c6d7763a3b3a","Type":"ContainerStarted","Data":"b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311"} Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.856574 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cmc44" event={"ID":"a89c8af2-338f-401f-aad5-c6d7763a3b3a","Type":"ContainerStarted","Data":"87718fcea05d1a2abf6fd28ad257ff7c1a08c5fa78b0fee32a770f100ba3e177"} Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.858272 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" event={"ID":"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f","Type":"ContainerStarted","Data":"f02c2b7cd3b6fd372807cb3bef1e94c63376211be0ef5968a6850857b9722fe7"} Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.860089 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" event={"ID":"6a9df230-75a1-4b64-8d00-c179e9c19080","Type":"ContainerStarted","Data":"c9ba3ecd4dbf40f800c11196492112694eb2fe160969cbd5f0b6fa7335552777"} Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.860181 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" event={"ID":"6a9df230-75a1-4b64-8d00-c179e9c19080","Type":"ContainerStarted","Data":"2176d4712fa4ee31d4322bb4cf9433058b2da0d6000e5bfb9e7d230fdffb3eda"} Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.860198 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" event={"ID":"6a9df230-75a1-4b64-8d00-c179e9c19080","Type":"ContainerStarted","Data":"196bd2b259b76529f9465763d2b9768b290464157dfac8f795e04bd6cd98a4a3"} Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.863860 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-srbpg" event={"ID":"93e597b5-a377-4988-8c59-eeace5ffa4e4","Type":"ContainerStarted","Data":"299876d90de1dc71dcbd805a769b991971944194b57cda3dff90392f35318dae"} Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.863977 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-srbpg" event={"ID":"93e597b5-a377-4988-8c59-eeace5ffa4e4","Type":"ContainerStarted","Data":"09389447edc5810169163731be1f52e1fae6e57d63d9deda32883a0652640d03"} Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.865557 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.865602 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.865612 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.865631 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.865644 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:22Z","lastTransitionTime":"2026-03-20T10:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.872646 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.888829 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.901031 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.913995 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.932396 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nbkmw\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.957571 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\
":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad9
9b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.968307 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.968361 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.968373 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.968391 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.968403 4860 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:22Z","lastTransitionTime":"2026-03-20T10:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.972306 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.984417 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4860]: I0320 10:56:22.995621 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.010384 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.027178 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\
"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d
742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.043957 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.060998 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.070802 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.070877 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.070893 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.070917 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.070932 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:23Z","lastTransitionTime":"2026-03-20T10:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.077059 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.094193 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://299876d90de1dc71dcbd805a769b991971944194b57cda3dff90392f35318dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.111881 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ba3ecd4dbf40f800c11196492112694eb2fe160969cbd5f0b6fa7335552777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2176d4712fa4ee31d4322bb4cf9433058b2da0d6000e5bfb9e7d230fdffb3eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.126715 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.144106 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69
b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.169437 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.173458 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.173497 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.173505 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.173520 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.173530 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:23Z","lastTransitionTime":"2026-03-20T10:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.189881 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.211415 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\
\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.232197 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:23 crc 
kubenswrapper[4860]: I0320 10:56:23.247022 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.268767 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nbkmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.275442 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.275476 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.275485 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.275499 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.275509 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:23Z","lastTransitionTime":"2026-03-20T10:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.287076 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.301095 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.378335 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.378394 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.378408 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.378431 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.378446 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:23Z","lastTransitionTime":"2026-03-20T10:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.413500 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:23 crc kubenswrapper[4860]: E0320 10:56:23.413635 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.481571 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.481616 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.481626 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.481643 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.481654 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:23Z","lastTransitionTime":"2026-03-20T10:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.584988 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.585048 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.585061 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.585087 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.585106 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:23Z","lastTransitionTime":"2026-03-20T10:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.688621 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.688673 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.688685 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.688703 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.688713 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:23Z","lastTransitionTime":"2026-03-20T10:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.791633 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.791681 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.791694 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.791713 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.791727 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:23Z","lastTransitionTime":"2026-03-20T10:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.871362 4860 generic.go:334] "Generic (PLEG): container finished" podID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerID="95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11" exitCode=0 Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.871457 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" event={"ID":"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f","Type":"ContainerDied","Data":"95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11"} Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.873828 4860 generic.go:334] "Generic (PLEG): container finished" podID="329ab518-a391-4483-8373-1329318b58da" containerID="ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d" exitCode=0 Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.873886 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" event={"ID":"329ab518-a391-4483-8373-1329318b58da","Type":"ContainerDied","Data":"ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d"} Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.892649 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.897725 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.897807 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.897825 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.897893 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.897912 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:23Z","lastTransitionTime":"2026-03-20T10:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.917890 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.935358 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.952736 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.966661 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:23 crc kubenswrapper[4860]: I0320 10:56:23.981836 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.001168 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 
10:56:24.001213 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.001277 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.001301 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.001317 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:24Z","lastTransitionTime":"2026-03-20T10:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.006193 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging 
kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\
\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nbkmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.030776 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.045913 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.059619 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://299876d90de1dc71dcbd805a769b991971944194b57cda3dff90392f35318dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.072475 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ba3ecd4dbf40f800c11196492112694eb2fe160969cbd5f0b6fa7335552777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2176d4712fa4ee31d4322bb4cf9433058b2da0d6000e5bfb9e7d230fdffb3eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.087944 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.105311 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.105354 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.105438 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.105470 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.105485 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:24Z","lastTransitionTime":"2026-03-20T10:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.108431 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69
b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.125074 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.139977 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.153765 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.171440 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.194110 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nbkmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.208844 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.208882 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.208891 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.208911 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.208925 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:24Z","lastTransitionTime":"2026-03-20T10:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.212219 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.236437 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.250706 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.265983 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://299876d90de1dc71dcbd805a769b991971944194b57cda3dff90392f35318dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.280547 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ba3ecd4dbf40f800c11196492112694eb2fe160969cbd5f0b6fa7335552777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2176d4712fa4ee31d4322bb4cf9433058b2da0d6000e5bfb9e7d230fdffb3eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.293629 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.308980 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.311205 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.311268 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.311281 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.311298 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.311309 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:24Z","lastTransitionTime":"2026-03-20T10:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.326965 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.412314 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.412330 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:24 crc kubenswrapper[4860]: E0320 10:56:24.412454 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:24 crc kubenswrapper[4860]: E0320 10:56:24.412974 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.413860 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.413888 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.413899 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.413914 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.413926 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:24Z","lastTransitionTime":"2026-03-20T10:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.517475 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.517842 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.517852 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.517868 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.517880 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:24Z","lastTransitionTime":"2026-03-20T10:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.620534 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.620578 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.620591 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.620609 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.620624 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:24Z","lastTransitionTime":"2026-03-20T10:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.730772 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.730824 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.730849 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.730872 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.730887 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:24Z","lastTransitionTime":"2026-03-20T10:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.837471 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.837533 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.837546 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.837568 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.837583 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:24Z","lastTransitionTime":"2026-03-20T10:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.879346 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" event={"ID":"329ab518-a391-4483-8373-1329318b58da","Type":"ContainerStarted","Data":"b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a"} Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.884002 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" event={"ID":"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f","Type":"ContainerStarted","Data":"9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793"} Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.884062 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" event={"ID":"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f","Type":"ContainerStarted","Data":"a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e"} Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.884072 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" event={"ID":"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f","Type":"ContainerStarted","Data":"0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d"} Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.904324 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.920299 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.934523 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://299876d90de1dc71dcbd805a769b991971944194b57cda3dff90392f35318dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.940654 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.940813 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.940930 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.941095 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.941270 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:24Z","lastTransitionTime":"2026-03-20T10:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.949345 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ba3ecd4dbf40f800c11196492112694eb2fe160969cbd5f0b6fa7335552777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2176d4712fa4ee31d4322bb4cf9433058b2da0d6000e5bfb9e7d230fdffb3eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.968541 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:24 crc kubenswrapper[4860]: I0320 10:56:24.985853 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.001827 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.015363 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:25Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.037629 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:25Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.043662 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.043699 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.043710 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.043727 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.043739 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:25Z","lastTransitionTime":"2026-03-20T10:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.059583 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:25Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.075021 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:25Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.090938 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:25Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.115307 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nbkmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:25Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.146675 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.146720 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.146735 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.146756 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.146770 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:25Z","lastTransitionTime":"2026-03-20T10:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.249181 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.249269 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.249283 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.249306 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.249326 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:25Z","lastTransitionTime":"2026-03-20T10:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.352113 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.352164 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.352175 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.352194 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.352206 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:25Z","lastTransitionTime":"2026-03-20T10:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.413539 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:25 crc kubenswrapper[4860]: E0320 10:56:25.413810 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.454654 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.454701 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.454711 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.454730 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.454746 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:25Z","lastTransitionTime":"2026-03-20T10:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.557608 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.557660 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.557673 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.557692 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.557707 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:25Z","lastTransitionTime":"2026-03-20T10:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.660466 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.660508 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.660517 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.660534 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.660545 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:25Z","lastTransitionTime":"2026-03-20T10:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.763261 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.763350 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.763362 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.763385 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.763401 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:25Z","lastTransitionTime":"2026-03-20T10:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.866903 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.866956 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.866965 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.866981 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.866991 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:25Z","lastTransitionTime":"2026-03-20T10:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.900614 4860 generic.go:334] "Generic (PLEG): container finished" podID="329ab518-a391-4483-8373-1329318b58da" containerID="b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a" exitCode=0 Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.900704 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" event={"ID":"329ab518-a391-4483-8373-1329318b58da","Type":"ContainerDied","Data":"b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a"} Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.906991 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" event={"ID":"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f","Type":"ContainerStarted","Data":"903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf"} Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.907043 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" event={"ID":"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f","Type":"ContainerStarted","Data":"f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f"} Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.907056 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" event={"ID":"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f","Type":"ContainerStarted","Data":"9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18"} Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.919365 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ba3ecd4dbf40f800c11196492112694eb2fe160969cbd5f0b6fa7335552777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2176d4712fa4ee31d4322bb4cf9433058b2da0d6
000e5bfb9e7d230fdffb3eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:25Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.936839 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:25Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.958838 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:25Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.969571 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.969621 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.969637 4860 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.969658 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.969675 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:25Z","lastTransitionTime":"2026-03-20T10:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.982790 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b5
4b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name
\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:25Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:25 crc kubenswrapper[4860]: I0320 10:56:25.995965 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:25Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.008729 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://299876d90de1dc71dcbd805a769b991971944194b57cda3dff90392f35318dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:26Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.027431 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:26Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.043610 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T10:56:26Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.062385 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nbkmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:26Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.071826 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.071863 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.071872 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.071892 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.071907 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:26Z","lastTransitionTime":"2026-03-20T10:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.077159 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:26Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.091745 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:26Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.107650 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:26Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.122134 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:26Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.175897 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 
10:56:26.175962 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.175974 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.175995 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.176010 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:26Z","lastTransitionTime":"2026-03-20T10:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.279278 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.279336 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.279351 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.279372 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.279387 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:26Z","lastTransitionTime":"2026-03-20T10:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.382074 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.382137 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.382149 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.382172 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.382192 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:26Z","lastTransitionTime":"2026-03-20T10:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.413449 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.413449 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:26 crc kubenswrapper[4860]: E0320 10:56:26.413636 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:26 crc kubenswrapper[4860]: E0320 10:56:26.413669 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.485830 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.485914 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.485932 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.485952 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.485965 4860 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:26Z","lastTransitionTime":"2026-03-20T10:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.514019 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.514078 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.514095 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.514121 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.514139 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:26Z","lastTransitionTime":"2026-03-20T10:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:26 crc kubenswrapper[4860]: E0320 10:56:26.531312 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:26Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.537206 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.537285 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.537298 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.537322 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.537336 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:26Z","lastTransitionTime":"2026-03-20T10:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:26 crc kubenswrapper[4860]: E0320 10:56:26.554042 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:26Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.559164 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.559246 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.559261 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.559285 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.559304 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:26Z","lastTransitionTime":"2026-03-20T10:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:26 crc kubenswrapper[4860]: E0320 10:56:26.574492 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:26Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.579185 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.579256 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.579269 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.579290 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.579311 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:26Z","lastTransitionTime":"2026-03-20T10:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:26 crc kubenswrapper[4860]: E0320 10:56:26.592563 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:26Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.597066 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.597111 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.597124 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.597146 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.597158 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:26Z","lastTransitionTime":"2026-03-20T10:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:26 crc kubenswrapper[4860]: E0320 10:56:26.610883 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:26Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:26 crc kubenswrapper[4860]: E0320 10:56:26.611017 4860 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.613377 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.613421 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.613438 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.613460 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.613472 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:26Z","lastTransitionTime":"2026-03-20T10:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.716181 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.716254 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.716270 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.716288 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.716301 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:26Z","lastTransitionTime":"2026-03-20T10:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.819383 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.819469 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.819492 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.819526 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.819549 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:26Z","lastTransitionTime":"2026-03-20T10:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.914997 4860 generic.go:334] "Generic (PLEG): container finished" podID="329ab518-a391-4483-8373-1329318b58da" containerID="8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0" exitCode=0 Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.915073 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" event={"ID":"329ab518-a391-4483-8373-1329318b58da","Type":"ContainerDied","Data":"8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0"} Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.922394 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.922466 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.922487 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.922516 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.922536 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:26Z","lastTransitionTime":"2026-03-20T10:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.933847 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:26Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.951070 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:26Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.971805 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:26Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:26 crc kubenswrapper[4860]: I0320 10:56:26.987202 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:26Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.008249 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nbkmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:27Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.023942 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52
676dc0a333f8ba81831f065b4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:27Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.031102 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.031157 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.031276 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.031307 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.031329 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:27Z","lastTransitionTime":"2026-03-20T10:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.047152 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:27Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.061398 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:27Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.072734 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://299876d90de1dc71dcbd805a769b991971944194b57cda3dff90392f35318dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:27Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.084289 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ba3ecd4dbf40f800c11196492112694eb2fe160969cbd5f0b6fa7335552777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2176d4712fa4ee31d4322bb4cf9433058b2da0d6000e5bfb9e7d230fdffb3eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:27Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.097568 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:27Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.110524 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:27Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.123869 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alert
er\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:27Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.133530 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.133685 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.133993 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.134200 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.134429 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:27Z","lastTransitionTime":"2026-03-20T10:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.237497 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.237782 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.237860 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.237926 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.237987 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:27Z","lastTransitionTime":"2026-03-20T10:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.341180 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.341278 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.341294 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.341319 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.341335 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:27Z","lastTransitionTime":"2026-03-20T10:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.412882 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:27 crc kubenswrapper[4860]: E0320 10:56:27.413073 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.433117 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:27Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 
10:56:27.444921 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.444983 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.445003 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.445030 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.445088 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:27Z","lastTransitionTime":"2026-03-20T10:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.457172 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:27Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.477046 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:27Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.490762 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://299876d90de1dc71dcbd805a769b991971944194b57cda3dff90392f35318dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:27Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.506245 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ba3ecd4dbf40f800c11196492112694eb2fe160969cbd5f0b6fa7335552777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2176d4712fa4ee31d4322bb4cf9433058b2da0d6000e5bfb9e7d230fdffb3eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:27Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.524720 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:27Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.540501 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:27Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.547950 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.548019 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.548038 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.548063 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.548083 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:27Z","lastTransitionTime":"2026-03-20T10:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.562880 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:27Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.582960 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:27Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.598850 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:27Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.612730 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:27Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.627229 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:27Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.649789 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nbkmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:27Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.650324 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.650380 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.650394 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.650417 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.650430 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:27Z","lastTransitionTime":"2026-03-20T10:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.759080 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.759206 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.759258 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.759284 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.759345 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:27Z","lastTransitionTime":"2026-03-20T10:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.861957 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.862018 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.862029 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.862051 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.862064 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:27Z","lastTransitionTime":"2026-03-20T10:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.923017 4860 generic.go:334] "Generic (PLEG): container finished" podID="329ab518-a391-4483-8373-1329318b58da" containerID="79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52" exitCode=0 Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.923384 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" event={"ID":"329ab518-a391-4483-8373-1329318b58da","Type":"ContainerDied","Data":"79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52"} Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.947520 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-d
ir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd
6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:27Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.965578 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.965642 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.965658 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 
10:56:27.965673 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.965683 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:27Z","lastTransitionTime":"2026-03-20T10:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.966125 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:27Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.978852 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://299876d90de1dc71dcbd805a769b991971944194b57cda3dff90392f35318dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:27Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:27 crc kubenswrapper[4860]: I0320 10:56:27.992668 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ba3ecd4dbf40f800c11196492112694eb2fe160969cbd5f0b6fa7335552777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2176d4712fa4ee31d4322bb4cf9433058b2da0d6000e5bfb9e7d230fdffb3eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:27Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.011929 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:28Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.028943 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:28Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.050688 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recur
siveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:28Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.069463 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.069552 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.069561 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.069670 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.069632 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T10:56:28Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.069689 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:28Z","lastTransitionTime":"2026-03-20T10:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.089131 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:28Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.106037 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:28Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.127581 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:28Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.152428 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:28Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.172773 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 
10:56:28.172828 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.172838 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.172857 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.172868 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:28Z","lastTransitionTime":"2026-03-20T10:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.177414 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging 
kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\
\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nbkmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:28Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.276370 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.276422 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.276435 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.276454 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.276465 4860 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:28Z","lastTransitionTime":"2026-03-20T10:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.356398 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-tggrc"] Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.356907 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-tggrc" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.360292 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.361382 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.361408 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.362295 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.378572 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:28Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.379095 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.379138 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.379148 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.379170 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:28 crc kubenswrapper[4860]: 
I0320 10:56:28.379184 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:28Z","lastTransitionTime":"2026-03-20T10:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.394121 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:28Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.409024 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:28Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.412837 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.412913 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:28 crc kubenswrapper[4860]: E0320 10:56:28.412975 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:28 crc kubenswrapper[4860]: E0320 10:56:28.413073 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.421794 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:28Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.442866 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nbkmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:28Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.458785 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/77d9c3a3-4ed8-43ec-bb4a-fc1d49784105-host\") pod \"node-ca-tggrc\" (UID: \"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105\") " pod="openshift-image-registry/node-ca-tggrc" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.458836 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rv72\" (UniqueName: \"kubernetes.io/projected/77d9c3a3-4ed8-43ec-bb4a-fc1d49784105-kube-api-access-2rv72\") pod \"node-ca-tggrc\" (UID: \"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105\") " pod="openshift-image-registry/node-ca-tggrc" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.458948 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/77d9c3a3-4ed8-43ec-bb4a-fc1d49784105-serviceca\") pod \"node-ca-tggrc\" (UID: \"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105\") " pod="openshift-image-registry/node-ca-tggrc" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.462275 4860 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",
\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\
"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc
20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:28Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.476947 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:28Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.485768 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.485817 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.485829 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:28 crc 
kubenswrapper[4860]: I0320 10:56:28.485850 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.485860 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:28Z","lastTransitionTime":"2026-03-20T10:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.489885 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://299876d90de1dc71dcbd805a769b991971944194b57cda3dff90392f35318dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7
f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:28Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.502283 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ba3ecd4dbf40f800c11196492112694eb2fe160969cbd5f0b6fa7335552777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2176d4712fa4ee31d4322bb4cf9433058b2da0d6
000e5bfb9e7d230fdffb3eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:28Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.514586 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:28Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.535615 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:28Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.555519 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recur
siveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:28Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.559858 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/77d9c3a3-4ed8-43ec-bb4a-fc1d49784105-serviceca\") pod \"node-ca-tggrc\" (UID: \"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105\") " pod="openshift-image-registry/node-ca-tggrc" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.559925 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/77d9c3a3-4ed8-43ec-bb4a-fc1d49784105-host\") pod \"node-ca-tggrc\" (UID: \"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105\") " pod="openshift-image-registry/node-ca-tggrc" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.559953 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rv72\" (UniqueName: \"kubernetes.io/projected/77d9c3a3-4ed8-43ec-bb4a-fc1d49784105-kube-api-access-2rv72\") pod \"node-ca-tggrc\" (UID: \"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105\") " pod="openshift-image-registry/node-ca-tggrc" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.560018 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/77d9c3a3-4ed8-43ec-bb4a-fc1d49784105-host\") pod \"node-ca-tggrc\" (UID: \"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105\") " pod="openshift-image-registry/node-ca-tggrc" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.561839 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/77d9c3a3-4ed8-43ec-bb4a-fc1d49784105-serviceca\") pod \"node-ca-tggrc\" (UID: \"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105\") " pod="openshift-image-registry/node-ca-tggrc" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.579477 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
6-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:28Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.589431 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.589456 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.589465 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.589479 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.589489 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:28Z","lastTransitionTime":"2026-03-20T10:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.590190 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tggrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rv72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tggrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:28Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.594289 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rv72\" (UniqueName: \"kubernetes.io/projected/77d9c3a3-4ed8-43ec-bb4a-fc1d49784105-kube-api-access-2rv72\") pod \"node-ca-tggrc\" (UID: \"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105\") " pod="openshift-image-registry/node-ca-tggrc" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.674750 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-tggrc" Mar 20 10:56:28 crc kubenswrapper[4860]: W0320 10:56:28.688081 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77d9c3a3_4ed8_43ec_bb4a_fc1d49784105.slice/crio-26f6d6a21944429d1371a570a335ec46703ae880f11acff94007fb6734a28654 WatchSource:0}: Error finding container 26f6d6a21944429d1371a570a335ec46703ae880f11acff94007fb6734a28654: Status 404 returned error can't find the container with id 26f6d6a21944429d1371a570a335ec46703ae880f11acff94007fb6734a28654 Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.691175 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.691208 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.691220 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.691240 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.691271 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:28Z","lastTransitionTime":"2026-03-20T10:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.794098 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.794140 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.794152 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.794170 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.794182 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:28Z","lastTransitionTime":"2026-03-20T10:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.898397 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.898451 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.898466 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.898486 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.898501 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:28Z","lastTransitionTime":"2026-03-20T10:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.928162 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-tggrc" event={"ID":"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105","Type":"ContainerStarted","Data":"26f6d6a21944429d1371a570a335ec46703ae880f11acff94007fb6734a28654"} Mar 20 10:56:28 crc kubenswrapper[4860]: I0320 10:56:28.932053 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" event={"ID":"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f","Type":"ContainerStarted","Data":"9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9"} Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.001522 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.001561 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.001572 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.001588 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.001600 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:29Z","lastTransitionTime":"2026-03-20T10:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.104814 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.104856 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.104865 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.104881 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.104894 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:29Z","lastTransitionTime":"2026-03-20T10:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.206988 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.207042 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.207055 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.207088 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.207098 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:29Z","lastTransitionTime":"2026-03-20T10:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.309808 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.309853 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.309865 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.309882 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.309893 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:29Z","lastTransitionTime":"2026-03-20T10:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.329208 4860 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.412396 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:29 crc kubenswrapper[4860]: E0320 10:56:29.412558 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.412964 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.412984 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.412992 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.413009 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.413021 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:29Z","lastTransitionTime":"2026-03-20T10:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.515328 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.515393 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.515411 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.515437 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.515458 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:29Z","lastTransitionTime":"2026-03-20T10:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.540606 4860 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.618978 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.619562 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.619579 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.619599 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.619613 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:29Z","lastTransitionTime":"2026-03-20T10:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.721893 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.721974 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.721996 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.722024 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.722043 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:29Z","lastTransitionTime":"2026-03-20T10:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.825881 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.825933 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.825943 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.825965 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.825979 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:29Z","lastTransitionTime":"2026-03-20T10:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.929227 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.929292 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.929304 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.929322 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.929333 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:29Z","lastTransitionTime":"2026-03-20T10:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.936286 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-tggrc" event={"ID":"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105","Type":"ContainerStarted","Data":"5b2ebec10e00fab3194ba3501ab22f0d9381d9e75d28690a6ee5723da97fd226"} Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.942634 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" event={"ID":"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f","Type":"ContainerStarted","Data":"3c66567a97f03870425c40452e85d7f5d3d9692343d67c511060081eff71f24a"} Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.943092 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.943353 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.948083 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" event={"ID":"329ab518-a391-4483-8373-1329318b58da","Type":"ContainerStarted","Data":"b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef"} Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.955153 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:29Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.972260 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:29Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:29 crc kubenswrapper[4860]: I0320 10:56:29.994121 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nbkmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:29Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.013263 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:30Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.022182 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.031732 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:30Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.032278 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.032328 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.032340 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 
10:56:30.032363 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.032711 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:30Z","lastTransitionTime":"2026-03-20T10:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.045936 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:30Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.057217 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://299876d90de1dc71dcbd805a769b991971944194b57cda3dff90392f35318dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:30Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.070987 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ba3ecd4dbf40f800c11196492112694eb2fe160969cbd5f0b6fa7335552777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2176d4712fa4ee31d4322bb4cf9433058b2da0d6000e5bfb9e7d230fdffb3eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:30Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.085345 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:30Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.100941 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:30Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.121018 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
6-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-r
esources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:30Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.135097 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.135137 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.135146 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.135162 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.135173 4860 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:30Z","lastTransitionTime":"2026-03-20T10:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.136693 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:30Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.149788 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:30Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.161040 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tggrc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2ebec10e00fab3194ba3501ab22f0d9381d9e75d28690a6ee5723da97fd226\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rv72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tggrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:30Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.177868 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242
fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:30Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.203841 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c687744
1ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"h
ostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9b
e8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:30Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.221683 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:30Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.233660 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://299876d90de1dc71dcbd805a769b991971944194b57cda3dff90392f35318dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:30Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.237757 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.237791 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.237801 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.237817 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.237828 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:30Z","lastTransitionTime":"2026-03-20T10:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.247238 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ba3ecd4dbf40f800c11196492112694eb2fe160969cbd5f0b6fa7335552777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2176d4712fa4ee31d4322bb4cf9433058b2da0d6000e5bfb9e7d230fdffb3eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:30Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.261174 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:30Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.274924 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:30Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.288367 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:30Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.301838 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tggrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2ebec10e00fab3194ba3501ab22f0d9381d9e75d28690a6ee5723da97fd226\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rv72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tggrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:30Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.318631 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09
e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:30Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.334124 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:30Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.340589 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.340629 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.340641 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 
10:56:30.340660 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.340676 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:30Z","lastTransitionTime":"2026-03-20T10:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.351148 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:30Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.368668 4860 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:30Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.388819 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c66567a97f03870425c40452e85d7f5d3d9692343d67c511060081eff71f24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nbkmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:30Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.413465 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.413548 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:30 crc kubenswrapper[4860]: E0320 10:56:30.413675 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:30 crc kubenswrapper[4860]: E0320 10:56:30.413959 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.443928 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.443984 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.443999 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.444020 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.444035 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:30Z","lastTransitionTime":"2026-03-20T10:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.546769 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.546814 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.546826 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.546844 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.546857 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:30Z","lastTransitionTime":"2026-03-20T10:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.652733 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.652800 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.652816 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.652845 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.652858 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:30Z","lastTransitionTime":"2026-03-20T10:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.756059 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.756120 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.756132 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.756151 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.756164 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:30Z","lastTransitionTime":"2026-03-20T10:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.859386 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.859440 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.859453 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.859472 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.859486 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:30Z","lastTransitionTime":"2026-03-20T10:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.951921 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.961937 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.961976 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.961989 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.962008 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.962020 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:30Z","lastTransitionTime":"2026-03-20T10:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.980082 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:30 crc kubenswrapper[4860]: I0320 10:56:30.994762 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:30Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.005132 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://299876d90de1dc71dcbd805a769b991971944194b57cda3dff90392f35318dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:31Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.017700 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ba3ecd4dbf40f800c11196492112694eb2fe160969cbd5f0b6fa7335552777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2176d4712fa4ee31d4322bb4cf9433058b2da0d6000e5bfb9e7d230fdffb3eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:31Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.030405 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:31Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.047020 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242
fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:31Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.065520 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.065587 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.065601 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.065624 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.065638 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:31Z","lastTransitionTime":"2026-03-20T10:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.069320 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:31Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.083869 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:31Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.095843 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:31Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.106543 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tggrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2ebec10e00fab3194ba3501ab22f0d9381d9e75d28690a6ee5723da97fd226\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"n
ode-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rv72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tggrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:31Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.119973 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:31Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.133071 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:31Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.153634 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c66567a97f03870425c40452e85d7f5d3d9692343d67c511060081eff71f24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nbkmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:31Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.167443 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:31Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.167966 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.168009 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.168021 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.168041 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.168054 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:31Z","lastTransitionTime":"2026-03-20T10:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.179673 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:31Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.270482 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.270561 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.270574 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.270616 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.270628 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:31Z","lastTransitionTime":"2026-03-20T10:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.374517 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.374576 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.374590 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.374618 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.374636 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:31Z","lastTransitionTime":"2026-03-20T10:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.412567 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:31 crc kubenswrapper[4860]: E0320 10:56:31.412895 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.413199 4860 scope.go:117] "RemoveContainer" containerID="b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.478380 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.478432 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.478442 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.478459 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.478472 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:31Z","lastTransitionTime":"2026-03-20T10:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.581096 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.581621 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.581634 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.581652 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.581662 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:31Z","lastTransitionTime":"2026-03-20T10:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.684762 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.684803 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.684814 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.684835 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.684856 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:31Z","lastTransitionTime":"2026-03-20T10:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.787545 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.787588 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.787600 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.787643 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.787658 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:31Z","lastTransitionTime":"2026-03-20T10:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.896906 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.896975 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.897001 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.897027 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.897043 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:31Z","lastTransitionTime":"2026-03-20T10:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.957435 4860 generic.go:334] "Generic (PLEG): container finished" podID="329ab518-a391-4483-8373-1329318b58da" containerID="b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef" exitCode=0 Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.958365 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" event={"ID":"329ab518-a391-4483-8373-1329318b58da","Type":"ContainerDied","Data":"b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef"} Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.975915 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba
93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\"
:\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:31Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:31 crc kubenswrapper[4860]: I0320 10:56:31.993641 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaea
d203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"o
s-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\
\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:31Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.008043 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.008088 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.008103 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.008122 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.008136 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:32Z","lastTransitionTime":"2026-03-20T10:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.020957 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:32Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.037453 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:32Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.050825 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://299876d90de1dc71dcbd805a769b991971944194b57cda3dff90392f35318dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:32Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.066752 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ba3ecd4dbf40f800c11196492112694eb2fe160969cbd5f0b6fa7335552777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2176d4712fa4ee31d4322bb4cf9433058b2da0d6000e5bfb9e7d230fdffb3eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:32Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.082796 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",
\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:32Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.100345 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptable
s-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:32Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.113031 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.113077 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.113089 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.113108 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.113121 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:32Z","lastTransitionTime":"2026-03-20T10:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.113925 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tggrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2ebec10e00fab3194ba3501ab22f0d9381d9e75d28690a6ee5723da97fd226\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rv72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tggrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:32Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.132081 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:32Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.149487 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:32Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.164719 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:32Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.175782 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:32Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.192832 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c66567a97f03870425c40452e85d7f5d3d9692343d67c511060081eff71f24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nbkmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:32Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.216329 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.216395 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.216407 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.216427 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.216438 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:32Z","lastTransitionTime":"2026-03-20T10:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.319859 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.319894 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.319903 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.319921 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.319932 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:32Z","lastTransitionTime":"2026-03-20T10:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.413382 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.413405 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:32 crc kubenswrapper[4860]: E0320 10:56:32.413562 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:32 crc kubenswrapper[4860]: E0320 10:56:32.413727 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.422611 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.422661 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.422680 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.422706 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.422724 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:32Z","lastTransitionTime":"2026-03-20T10:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.525327 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.525370 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.525383 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.525401 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.525413 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:32Z","lastTransitionTime":"2026-03-20T10:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.627530 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.627570 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.627580 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.627597 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.627607 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:32Z","lastTransitionTime":"2026-03-20T10:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.730288 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.730321 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.730330 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.730346 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.730358 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:32Z","lastTransitionTime":"2026-03-20T10:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.833415 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.833468 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.833478 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.833496 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.833506 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:32Z","lastTransitionTime":"2026-03-20T10:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.937073 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.937131 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.937144 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.937181 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.937217 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:32Z","lastTransitionTime":"2026-03-20T10:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.963291 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.965399 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2c14e94ed75bdc67469f86002f56ed08fc3810f5a6316401c00a66d56d405e14"} Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.965771 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.970136 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" event={"ID":"329ab518-a391-4483-8373-1329318b58da","Type":"ContainerStarted","Data":"0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109"} Mar 20 10:56:32 crc kubenswrapper[4860]: I0320 10:56:32.982427 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:32Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.001531 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:32Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.014020 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tggrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2ebec10e00fab3194ba3501ab22f0d9381d9e75d28690a6ee5723da97fd226\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"n
ode-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rv72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tggrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:33Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.028540 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c14e94ed75bdc67469f86002f56ed08fc3810f5a6316401c00a66d56d405e14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:33Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.040833 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.040889 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.040899 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.040917 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.040929 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:33Z","lastTransitionTime":"2026-03-20T10:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.041651 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:33Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.054060 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:33Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.066102 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:33Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.090546 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c66567a97f03870425c40452e85d7f5d3d9692343d67c511060081eff71f24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nbkmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:33Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.110225 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.12
6.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:33Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.126492 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:33Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.138029 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://299876d90de1dc71dcbd805a769b991971944194b57cda3dff90392f35318dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:33Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.143051 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.143088 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.143097 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.143117 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.143128 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:33Z","lastTransitionTime":"2026-03-20T10:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.149708 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ba3ecd4dbf40f800c11196492112694eb2fe160969cbd5f0b6fa7335552777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2176d4712fa4ee31d4322bb4cf9433058b2da0d6000e5bfb9e7d230fdffb3eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:33Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.161035 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:33Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.174753 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:33Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.187539 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:33Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.200137 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:33Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.211900 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:33Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.230286 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c66567a97f03870425c40452e85d7f5d3d9692343d67c511060081eff71f24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nbkmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:33Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.245735 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c14e94ed75bdc67469f86002f56ed08fc3810f5a6316401c00a66d56d405e14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:33Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.245887 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.245919 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.245928 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.245943 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.245955 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:33Z","lastTransitionTime":"2026-03-20T10:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.264122 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:33Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.274629 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:33Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.284341 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://299876d90de1dc71dcbd805a769b991971944194b57cda3dff90392f35318dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:33Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.295137 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ba3ecd4dbf40f800c11196492112694eb2fe160969cbd5f0b6fa7335552777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2176d4712fa4ee31d4322bb4cf9433058b2da0d6000e5bfb9e7d230fdffb3eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:33Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.309321 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:33Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.327366 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:33Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.344848 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:33Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.348723 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.348768 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.348781 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.348800 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.348814 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:33Z","lastTransitionTime":"2026-03-20T10:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.358654 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:33Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.372869 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tggrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2ebec10e00fab3194ba3501ab22f0d9381d9e75d28690a6ee5723da97fd226\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rv72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tggrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:33Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.412654 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:33 crc kubenswrapper[4860]: E0320 10:56:33.412807 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.451148 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.451218 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.451258 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.451280 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.451293 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:33Z","lastTransitionTime":"2026-03-20T10:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.554623 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.554726 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.554740 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.555309 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.555373 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:33Z","lastTransitionTime":"2026-03-20T10:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.659126 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.659188 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.659199 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.659226 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.659253 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:33Z","lastTransitionTime":"2026-03-20T10:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.761782 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.761820 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.761830 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.761847 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.761858 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:33Z","lastTransitionTime":"2026-03-20T10:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.866116 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.866164 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.866176 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.866203 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.866214 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:33Z","lastTransitionTime":"2026-03-20T10:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.970096 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.970139 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.970149 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.970172 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.970184 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:33Z","lastTransitionTime":"2026-03-20T10:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.977085 4860 generic.go:334] "Generic (PLEG): container finished" podID="329ab518-a391-4483-8373-1329318b58da" containerID="0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109" exitCode=0 Mar 20 10:56:33 crc kubenswrapper[4860]: I0320 10:56:33.977145 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" event={"ID":"329ab518-a391-4483-8373-1329318b58da","Type":"ContainerDied","Data":"0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109"} Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.016729 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.052642 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.069084 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.073480 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.073525 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.073536 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:34 crc 
kubenswrapper[4860]: I0320 10:56:34.073559 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.073572 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:34Z","lastTransitionTime":"2026-03-20T10:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.082504 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://299876d90de1dc71dcbd805a769b991971944194b57cda3dff90392f35318dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7
f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.098385 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ba3ecd4dbf40f800c11196492112694eb2fe160969cbd5f0b6fa7335552777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2176d4712fa4ee31d4322bb4cf9433058b2da0d6
000e5bfb9e7d230fdffb3eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.114817 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.127586 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.139969 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.152064 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tggrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2ebec10e00fab3194ba3501ab22f0d9381d9e75d28690a6ee5723da97fd226\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rv72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tggrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.169059 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c14e94ed75bdc67469f86002f56ed08fc3810f5a6316401c00a66d56d405e14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.175829 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.175874 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.175887 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.175910 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.175923 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:34Z","lastTransitionTime":"2026-03-20T10:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.185139 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.198760 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.212334 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.232067 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c66567a97f03870425c40452e85d7f5d3d9692343d67c511060081eff71f24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nbkmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.278715 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.278756 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.278765 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.278781 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.278792 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:34Z","lastTransitionTime":"2026-03-20T10:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.299807 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ngb2s"] Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.300310 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ngb2s" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.302148 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.302729 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.313348 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ngb2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a19086-4679-4d42-96b8-942e00d8491f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ngb2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.319556 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/69a19086-4679-4d42-96b8-942e00d8491f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-ngb2s\" (UID: \"69a19086-4679-4d42-96b8-942e00d8491f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ngb2s" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.319588 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/69a19086-4679-4d42-96b8-942e00d8491f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-ngb2s\" (UID: \"69a19086-4679-4d42-96b8-942e00d8491f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ngb2s" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.319672 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/69a19086-4679-4d42-96b8-942e00d8491f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-ngb2s\" (UID: \"69a19086-4679-4d42-96b8-942e00d8491f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ngb2s" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.319834 4860 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrm4v\" (UniqueName: \"kubernetes.io/projected/69a19086-4679-4d42-96b8-942e00d8491f-kube-api-access-rrm4v\") pod \"ovnkube-control-plane-749d76644c-ngb2s\" (UID: \"69a19086-4679-4d42-96b8-942e00d8491f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ngb2s" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.325396 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.335217 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://299876d90de1dc71dcbd805a769b991971944194b57cda3dff90392f35318dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.346814 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ba3ecd4dbf40f800c11196492112694eb2fe160969cbd5f0b6fa7335552777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2176d4712fa4ee31d4322bb4cf9433058b2da0d6000e5bfb9e7d230fdffb3eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.360975 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.376028 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.381400 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.381425 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.381436 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.381451 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.381462 4860 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:34Z","lastTransitionTime":"2026-03-20T10:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.398391 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-
dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920
853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a6
7314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.412560 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.412684 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:34 crc kubenswrapper[4860]: E0320 10:56:34.412696 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:34 crc kubenswrapper[4860]: E0320 10:56:34.412867 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.415428 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.420166 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/69a19086-4679-4d42-96b8-942e00d8491f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-ngb2s\" (UID: \"69a19086-4679-4d42-96b8-942e00d8491f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ngb2s" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.420227 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrm4v\" (UniqueName: \"kubernetes.io/projected/69a19086-4679-4d42-96b8-942e00d8491f-kube-api-access-rrm4v\") pod \"ovnkube-control-plane-749d76644c-ngb2s\" (UID: \"69a19086-4679-4d42-96b8-942e00d8491f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ngb2s" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.420274 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" 
(UniqueName: \"kubernetes.io/configmap/69a19086-4679-4d42-96b8-942e00d8491f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-ngb2s\" (UID: \"69a19086-4679-4d42-96b8-942e00d8491f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ngb2s" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.420299 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/69a19086-4679-4d42-96b8-942e00d8491f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-ngb2s\" (UID: \"69a19086-4679-4d42-96b8-942e00d8491f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ngb2s" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.421158 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/69a19086-4679-4d42-96b8-942e00d8491f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-ngb2s\" (UID: \"69a19086-4679-4d42-96b8-942e00d8491f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ngb2s" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.421340 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/69a19086-4679-4d42-96b8-942e00d8491f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-ngb2s\" (UID: \"69a19086-4679-4d42-96b8-942e00d8491f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ngb2s" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.427919 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/69a19086-4679-4d42-96b8-942e00d8491f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-ngb2s\" (UID: \"69a19086-4679-4d42-96b8-942e00d8491f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ngb2s" Mar 20 
10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.428761 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.439097 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrm4v\" (UniqueName: \"kubernetes.io/projected/69a19086-4679-4d42-96b8-942e00d8491f-kube-api-access-rrm4v\") pod \"ovnkube-control-plane-749d76644c-ngb2s\" (UID: \"69a19086-4679-4d42-96b8-942e00d8491f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ngb2s" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.441975 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tggrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2ebec10e00fab3194ba3501ab22f0d9381d9e75d28690a6ee5723da97fd226\\\",\\\"image\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rv72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tggrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.458656 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.475330 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.484004 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 
10:56:34.484049 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.484060 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.484081 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.484094 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:34Z","lastTransitionTime":"2026-03-20T10:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.496578 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c66567a97f03870425c40452e85d7f5d3d9692343d67c511060081eff71f24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mou
ntPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9\\\",\
\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nbkmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.514131 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c14e94ed75bdc67469f86002f56ed08fc3810f5a6316401c00a66d56d405e14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.528028 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.586951 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.587003 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.587014 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 
10:56:34.587031 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.587044 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:34Z","lastTransitionTime":"2026-03-20T10:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.615377 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ngb2s" Mar 20 10:56:34 crc kubenswrapper[4860]: W0320 10:56:34.631305 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69a19086_4679_4d42_96b8_942e00d8491f.slice/crio-689f3898c85bbd1330a122c8329ad41871a10f1259e1aa08a905153b898167a2 WatchSource:0}: Error finding container 689f3898c85bbd1330a122c8329ad41871a10f1259e1aa08a905153b898167a2: Status 404 returned error can't find the container with id 689f3898c85bbd1330a122c8329ad41871a10f1259e1aa08a905153b898167a2 Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.690353 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.690390 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.690401 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.690419 4860 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeNotReady" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.690431 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:34Z","lastTransitionTime":"2026-03-20T10:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.793027 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.793075 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.793087 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.793110 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.793123 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:34Z","lastTransitionTime":"2026-03-20T10:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.898492 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.898560 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.898587 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.898624 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.898641 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:34Z","lastTransitionTime":"2026-03-20T10:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:34 crc kubenswrapper[4860]: I0320 10:56:34.983348 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ngb2s" event={"ID":"69a19086-4679-4d42-96b8-942e00d8491f","Type":"ContainerStarted","Data":"689f3898c85bbd1330a122c8329ad41871a10f1259e1aa08a905153b898167a2"} Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.001850 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.001904 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.001915 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.001935 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.001951 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:35Z","lastTransitionTime":"2026-03-20T10:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.041578 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-q85gq"] Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.042325 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:56:35 crc kubenswrapper[4860]: E0320 10:56:35.042484 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.060904 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\
"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:35Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.076080 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T10:56:35Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.088352 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tggrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2ebec10e00fab3194ba3501ab22f0d9381d9e75d28690a6ee5723da97fd226\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rv72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tggrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:35Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.102465 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q85gq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"035f0b3d-92ee-4564-8dad-28b231e1c800\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q85gq\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:35Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.104075 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.104142 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.104153 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.104173 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.104185 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:35Z","lastTransitionTime":"2026-03-20T10:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.122783 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:35Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.128435 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/035f0b3d-92ee-4564-8dad-28b231e1c800-metrics-certs\") pod \"network-metrics-daemon-q85gq\" (UID: \"035f0b3d-92ee-4564-8dad-28b231e1c800\") " pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.128515 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlrn2\" (UniqueName: 
\"kubernetes.io/projected/035f0b3d-92ee-4564-8dad-28b231e1c800-kube-api-access-dlrn2\") pod \"network-metrics-daemon-q85gq\" (UID: \"035f0b3d-92ee-4564-8dad-28b231e1c800\") " pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.141394 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:35Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.164341 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c66567a97f03870425c40452e85d7f5d3d9692343d67c511060081eff71f24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nbkmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:35Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.177897 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c14e94ed75bdc67469f86002f56ed08fc3810f5a6316401c00a66d56d405e14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:35Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.192378 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:35Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.206477 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ngb2s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a19086-4679-4d42-96b8-942e00d8491f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ngb2s\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:35Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.207014 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.207044 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.207053 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.207070 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.207080 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:35Z","lastTransitionTime":"2026-03-20T10:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.222323 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:35Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.229860 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/035f0b3d-92ee-4564-8dad-28b231e1c800-metrics-certs\") pod \"network-metrics-daemon-q85gq\" (UID: \"035f0b3d-92ee-4564-8dad-28b231e1c800\") " pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.229957 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlrn2\" (UniqueName: \"kubernetes.io/projected/035f0b3d-92ee-4564-8dad-28b231e1c800-kube-api-access-dlrn2\") pod \"network-metrics-daemon-q85gq\" (UID: \"035f0b3d-92ee-4564-8dad-28b231e1c800\") " pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:56:35 crc kubenswrapper[4860]: E0320 10:56:35.230135 4860 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 10:56:35 crc kubenswrapper[4860]: E0320 
10:56:35.230318 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/035f0b3d-92ee-4564-8dad-28b231e1c800-metrics-certs podName:035f0b3d-92ee-4564-8dad-28b231e1c800 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:35.730287059 +0000 UTC m=+119.951648027 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/035f0b3d-92ee-4564-8dad-28b231e1c800-metrics-certs") pod "network-metrics-daemon-q85gq" (UID: "035f0b3d-92ee-4564-8dad-28b231e1c800") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.237542 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://299876d90de1dc71dcbd805a769b991971944194b57cda3dff90392f35318dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b1
9888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:35Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.248784 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlrn2\" (UniqueName: \"kubernetes.io/projected/035f0b3d-92ee-4564-8dad-28b231e1c800-kube-api-access-dlrn2\") pod \"network-metrics-daemon-q85gq\" (UID: \"035f0b3d-92ee-4564-8dad-28b231e1c800\") " pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.251516 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ba3ecd4dbf40f800c11196492112694eb2fe160969cbd5f0b6fa7335552777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2176d4712fa4ee31d4322bb4cf9433058b2da0d6
000e5bfb9e7d230fdffb3eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:35Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.267698 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:35Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.286057 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:35Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.310693 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.310760 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.310780 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.310809 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.310834 4860 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:35Z","lastTransitionTime":"2026-03-20T10:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.314285 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-
dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920
853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a6
7314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:35Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.412453 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:35 crc kubenswrapper[4860]: E0320 10:56:35.412593 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.414095 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.414139 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.414150 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.414172 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.414185 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:35Z","lastTransitionTime":"2026-03-20T10:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.517481 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.517537 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.517550 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.517571 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.517584 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:35Z","lastTransitionTime":"2026-03-20T10:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.620963 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.621049 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.621071 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.621107 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.621133 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:35Z","lastTransitionTime":"2026-03-20T10:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.725127 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.725166 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.725175 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.725191 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.725201 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:35Z","lastTransitionTime":"2026-03-20T10:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.735069 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/035f0b3d-92ee-4564-8dad-28b231e1c800-metrics-certs\") pod \"network-metrics-daemon-q85gq\" (UID: \"035f0b3d-92ee-4564-8dad-28b231e1c800\") " pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:56:35 crc kubenswrapper[4860]: E0320 10:56:35.735221 4860 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 10:56:35 crc kubenswrapper[4860]: E0320 10:56:35.735671 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/035f0b3d-92ee-4564-8dad-28b231e1c800-metrics-certs podName:035f0b3d-92ee-4564-8dad-28b231e1c800 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:36.735651847 +0000 UTC m=+120.957012755 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/035f0b3d-92ee-4564-8dad-28b231e1c800-metrics-certs") pod "network-metrics-daemon-q85gq" (UID: "035f0b3d-92ee-4564-8dad-28b231e1c800") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.828195 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.828259 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.828274 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.828297 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.828313 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:35Z","lastTransitionTime":"2026-03-20T10:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.932487 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.932565 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.932584 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.932608 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.932632 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:35Z","lastTransitionTime":"2026-03-20T10:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.993125 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ngb2s" event={"ID":"69a19086-4679-4d42-96b8-942e00d8491f","Type":"ContainerStarted","Data":"7a83599b2c542987a93965e3fd026abc1eccd07fb78dc6ad777b03821eb4ed59"} Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.993288 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ngb2s" event={"ID":"69a19086-4679-4d42-96b8-942e00d8491f","Type":"ContainerStarted","Data":"029dd44d42d236df04007ba436ae1800ebdf356c32c3506f621c55418b50526e"} Mar 20 10:56:35 crc kubenswrapper[4860]: I0320 10:56:35.996896 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" event={"ID":"329ab518-a391-4483-8373-1329318b58da","Type":"ContainerStarted","Data":"8c4df2ec9fe14b2447ca7aa8d5d033c15f8ac4a7ce1cd4cb8430596293d9b9d4"} Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.010806 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c14e94ed75bdc67469f86002f56ed08fc3810f5a6316401c00a66d56d405e14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.027028 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.038568 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.038614 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.038630 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 
10:56:36.038654 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.038667 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:36Z","lastTransitionTime":"2026-03-20T10:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.045089 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.057967 4860 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.078457 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c66567a97f03870425c40452e85d7f5d3d9692343d67c511060081eff71f24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nbkmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.091504 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ngb2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a19086-4679-4d42-96b8-942e00d8491f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://029dd44d42d236df04007ba436ae1800ebdf356c32c3506f621c55418b50526e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a83599b2c542987a93965e3fd026abc1eccd07fb78dc6ad777b03821eb4ed59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ngb2s\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.110950 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.140538 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:56:36 crc kubenswrapper[4860]: E0320 10:56:36.140737 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 10:57:08.140699504 +0000 UTC m=+152.362060402 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.141319 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a
67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"starte
dAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIP
s\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"i
mageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.142376 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.142414 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.142428 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.142452 4860 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeNotReady" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.142469 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:36Z","lastTransitionTime":"2026-03-20T10:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.160575 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when 
the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.174201 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://299876d90de1dc71dcbd805a769b991971944194b57cda3dff90392f35318dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.187029 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ba3ecd4dbf40f800c11196492112694eb2fe160969cbd5f0b6fa7335552777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2176d4712fa4ee31d4322bb4cf9433058b2da0d6000e5bfb9e7d230fdffb3eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.201370 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.217048 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.233521 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alert
er\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.241447 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.241500 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.241526 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.241542 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:36 crc kubenswrapper[4860]: E0320 10:56:36.241612 4860 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 10:56:36 crc kubenswrapper[4860]: E0320 10:56:36.241627 4860 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 10:56:36 crc kubenswrapper[4860]: E0320 10:56:36.241636 4860 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 10:56:36 crc kubenswrapper[4860]: E0320 10:56:36.241649 4860 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:56:36 crc kubenswrapper[4860]: E0320 10:56:36.241675 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-20 10:57:08.241659193 +0000 UTC m=+152.463020091 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 10:56:36 crc kubenswrapper[4860]: E0320 10:56:36.241687 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 10:57:08.241682313 +0000 UTC m=+152.463043211 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:56:36 crc kubenswrapper[4860]: E0320 10:56:36.241702 4860 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 10:56:36 crc kubenswrapper[4860]: E0320 10:56:36.241727 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 10:57:08.241718324 +0000 UTC m=+152.463079222 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 10:56:36 crc kubenswrapper[4860]: E0320 10:56:36.241832 4860 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 10:56:36 crc kubenswrapper[4860]: E0320 10:56:36.241880 4860 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 10:56:36 crc kubenswrapper[4860]: E0320 10:56:36.241896 4860 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:56:36 crc kubenswrapper[4860]: E0320 10:56:36.241988 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 10:57:08.241962391 +0000 UTC m=+152.463323459 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.244953 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.244984 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.245003 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.245021 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.245034 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:36Z","lastTransitionTime":"2026-03-20T10:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.249177 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tggrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2ebec10e00fab3194ba3501ab22f0d9381d9e75d28690a6ee5723da97fd226\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rv72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tggrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.261268 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q85gq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"035f0b3d-92ee-4564-8dad-28b231e1c800\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q85gq\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.280675 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c4df2ec9fe14b2447ca7aa8d5d033c15f8ac4a7ce1cd4cb8430596293d9b9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\
":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"sys
tem-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.305783 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f4
2928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\
\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c0
1007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.323014 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.337982 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://299876d90de1dc71dcbd805a769b991971944194b57cda3dff90392f35318dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.347901 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.347949 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.347963 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.347985 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.347999 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:36Z","lastTransitionTime":"2026-03-20T10:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.352808 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ba3ecd4dbf40f800c11196492112694eb2fe160969cbd5f0b6fa7335552777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2176d4712fa4ee31d4322bb4cf9433058b2da0d6000e5bfb9e7d230fdffb3eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.367700 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.383718 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.398782 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.410450 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tggrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2ebec10e00fab3194ba3501ab22f0d9381d9e75d28690a6ee5723da97fd226\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rv72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tggrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.412643 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:56:36 crc kubenswrapper[4860]: E0320 10:56:36.412775 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.413109 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:36 crc kubenswrapper[4860]: E0320 10:56:36.413190 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.413310 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:36 crc kubenswrapper[4860]: E0320 10:56:36.413398 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.427853 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q85gq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"035f0b3d-92ee-4564-8dad-28b231e1c800\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q85gq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:36 crc 
kubenswrapper[4860]: I0320 10:56:36.441314 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c14e94ed75bdc67469f86002f56ed08fc3810f5a6316401c00a66d56d405e14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.451539 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.451615 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.451635 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.451661 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.451679 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:36Z","lastTransitionTime":"2026-03-20T10:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.459038 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.474996 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.486823 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.511962 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c66567a97f03870425c40452e85d7f5d3d9692343d67c511060081eff71f24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nbkmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.526495 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ngb2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a19086-4679-4d42-96b8-942e00d8491f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://029dd44d42d236df04007ba436ae1800ebdf356c32c3506f621c55418b50526e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a83599b2c542987a93965e3fd026abc1eccd07fb78dc6ad777b03821eb4ed59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ngb2s\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.554016 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.554055 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.554068 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.554089 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.554102 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:36Z","lastTransitionTime":"2026-03-20T10:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.647892 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.647959 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.647979 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.648023 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.648046 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:36Z","lastTransitionTime":"2026-03-20T10:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:36 crc kubenswrapper[4860]: E0320 10:56:36.666661 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.671757 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.671815 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.671828 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.671849 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.671864 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:36Z","lastTransitionTime":"2026-03-20T10:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:36 crc kubenswrapper[4860]: E0320 10:56:36.688542 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.694404 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.694477 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.694498 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.694529 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.694550 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:36Z","lastTransitionTime":"2026-03-20T10:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:36 crc kubenswrapper[4860]: E0320 10:56:36.713012 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.718761 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.718826 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.718846 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.718870 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.718902 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:36Z","lastTransitionTime":"2026-03-20T10:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:36 crc kubenswrapper[4860]: E0320 10:56:36.737409 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.743166 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.743235 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.743292 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.743328 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.743355 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:36Z","lastTransitionTime":"2026-03-20T10:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.747824 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/035f0b3d-92ee-4564-8dad-28b231e1c800-metrics-certs\") pod \"network-metrics-daemon-q85gq\" (UID: \"035f0b3d-92ee-4564-8dad-28b231e1c800\") " pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:56:36 crc kubenswrapper[4860]: E0320 10:56:36.747984 4860 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 10:56:36 crc kubenswrapper[4860]: E0320 10:56:36.748033 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/035f0b3d-92ee-4564-8dad-28b231e1c800-metrics-certs podName:035f0b3d-92ee-4564-8dad-28b231e1c800 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:38.748017017 +0000 UTC m=+122.969377915 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/035f0b3d-92ee-4564-8dad-28b231e1c800-metrics-certs") pod "network-metrics-daemon-q85gq" (UID: "035f0b3d-92ee-4564-8dad-28b231e1c800") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 10:56:36 crc kubenswrapper[4860]: E0320 10:56:36.760381 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb4
9c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\"
:[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d4
6c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\
\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e
8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:36 crc kubenswrapper[4860]: E0320 10:56:36.760503 4860 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.762212 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.762263 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.762274 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.762290 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.762302 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:36Z","lastTransitionTime":"2026-03-20T10:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.865942 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.865999 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.866010 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.866030 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.866042 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:36Z","lastTransitionTime":"2026-03-20T10:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.969205 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.969283 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.969296 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.969316 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:36 crc kubenswrapper[4860]: I0320 10:56:36.969329 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:36Z","lastTransitionTime":"2026-03-20T10:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.004380 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nbkmw_eb85f6f9-1c0f-4388-9464-25dfe48d8d0f/ovnkube-controller/0.log" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.011385 4860 generic.go:334] "Generic (PLEG): container finished" podID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerID="3c66567a97f03870425c40452e85d7f5d3d9692343d67c511060081eff71f24a" exitCode=1 Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.011429 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" event={"ID":"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f","Type":"ContainerDied","Data":"3c66567a97f03870425c40452e85d7f5d3d9692343d67c511060081eff71f24a"} Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.012179 4860 scope.go:117] "RemoveContainer" containerID="3c66567a97f03870425c40452e85d7f5d3d9692343d67c511060081eff71f24a" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.027688 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.044985 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.071241 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c66567a97f03870425c40452e85d7f5d3d9692343d67c511060081eff71f24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c66567a97f03870425c40452e85d7f5d3d9692343d67c511060081eff71f24a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"message\\\":\\\"mers/externalversions/factory.go:140\\\\nI0320 10:56:35.889696 6663 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 10:56:35.890004 6663 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 10:56:35.890027 6663 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 10:56:35.890077 6663 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 10:56:35.890136 6663 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 10:56:35.890097 6663 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 10:56:35.890208 6663 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 10:56:35.890288 6663 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 10:56:35.890314 6663 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 10:56:35.890331 6663 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0320 10:56:35.890321 6663 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 10:56:35.890392 6663 factory.go:656] Stopping watch factory\\\\nI0320 10:56:35.890425 6663 ovnkube.go:599] Stopped ovnkube\\\\nI0320 10:56:35.890405 6663 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 
10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nbkmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.072037 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.072089 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.072099 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.072119 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.072129 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:37Z","lastTransitionTime":"2026-03-20T10:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.086337 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c14e94ed75bdc67469f86002f56ed08fc3810f5a6316401c00a66d56d405e14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.101392 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.116429 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ngb2s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a19086-4679-4d42-96b8-942e00d8491f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://029dd44d42d236df04007ba436ae1800ebdf356c32c3506f621c55418b50526e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a83599b2c542987a93965e3fd026abc1eccd
07fb78dc6ad777b03821eb4ed59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ngb2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.130168 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.141368 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://299876d90de1dc71dcbd805a769b991971944194b57cda3dff90392f35318dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.156408 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ba3ecd4dbf40f800c11196492112694eb2fe160969cbd5f0b6fa7335552777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2176d4712fa4ee31d4322bb4cf9433058b2da0d6000e5bfb9e7d230fdffb3eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.170741 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.175407 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.175526 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.175596 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.175619 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.175649 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:37Z","lastTransitionTime":"2026-03-20T10:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.188741 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c4df2ec9fe14b2447ca7aa8d5d033c15f8ac4a7ce1cd4cb8430596293d9b9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.210966 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.1
26.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift
-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.225955 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.240967 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.254345 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tggrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2ebec10e00fab3194ba3501ab22f0d9381d9e75d28690a6ee5723da97fd226\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"n
ode-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rv72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tggrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.268907 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q85gq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"035f0b3d-92ee-4564-8dad-28b231e1c800\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q85gq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:37 crc 
kubenswrapper[4860]: E0320 10:56:37.276960 4860 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.412680 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:37 crc kubenswrapper[4860]: E0320 10:56:37.412837 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.428098 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c14e94ed75bdc67469f86002f56ed08fc3810f5a6316401c00a66d56d405e14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.443066 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.456581 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.472816 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.498654 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c66567a97f03870425c40452e85d7f5d3d9692343d67c511060081eff71f24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c66567a97f03870425c40452e85d7f5d3d9692343d67c511060081eff71f24a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"message\\\":\\\"mers/externalversions/factory.go:140\\\\nI0320 10:56:35.889696 6663 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 10:56:35.890004 6663 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 10:56:35.890027 6663 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 10:56:35.890077 6663 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 10:56:35.890136 6663 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 10:56:35.890097 6663 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 10:56:35.890208 6663 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 10:56:35.890288 6663 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 10:56:35.890314 6663 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 10:56:35.890331 6663 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0320 10:56:35.890321 6663 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 10:56:35.890392 6663 factory.go:656] Stopping watch factory\\\\nI0320 10:56:35.890425 6663 ovnkube.go:599] Stopped ovnkube\\\\nI0320 10:56:35.890405 6663 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 
10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nbkmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.513627 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ngb2s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a19086-4679-4d42-96b8-942e00d8491f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://029dd44d42d236df04007ba436ae1800ebdf356c32c3506f621c55418b50526e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a83599b2c542987a93965e3fd026abc1eccd
07fb78dc6ad777b03821eb4ed59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ngb2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.532135 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.554266 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c4df2ec9fe14b2447ca7aa8d5d033c15f8ac4a7ce1cd4cb8430596293d9b9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bd
a918ce7c5fade6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:
56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:31Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.579449 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://
b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c68774
41ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:37 crc kubenswrapper[4860]: E0320 10:56:37.586663 4860 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.601998 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.618048 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://299876d90de1dc71dcbd805a769b991971944194b57cda3dff90392f35318dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.633326 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ba3ecd4dbf40f800c11196492112694eb2fe160969cbd5f0b6fa7335552777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2176d4712fa4ee31d4322bb4cf9433058b2da0d6000e5bfb9e7d230fdffb3eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.651166 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q85gq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"035f0b3d-92ee-4564-8dad-28b231e1c800\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q85gq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:37 crc 
kubenswrapper[4860]: I0320 10:56:37.666831 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.682028 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:37 crc kubenswrapper[4860]: I0320 10:56:37.695817 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tggrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2ebec10e00fab3194ba3501ab22f0d9381d9e75d28690a6ee5723da97fd226\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086
a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rv72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tggrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:38 crc kubenswrapper[4860]: I0320 10:56:38.018100 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nbkmw_eb85f6f9-1c0f-4388-9464-25dfe48d8d0f/ovnkube-controller/0.log" Mar 20 10:56:38 crc kubenswrapper[4860]: I0320 10:56:38.023558 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" event={"ID":"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f","Type":"ContainerStarted","Data":"355de5a5f850fa7e12c9ee5b4b152314e331aca100b142486cde31a6ca5b9f77"} Mar 20 10:56:38 crc kubenswrapper[4860]: I0320 10:56:38.024156 4860 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:38 crc kubenswrapper[4860]: I0320 10:56:38.042896 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\
\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:38Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:38 crc kubenswrapper[4860]: I0320 10:56:38.055124 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-s
cript\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:38Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:38 crc kubenswrapper[4860]: I0320 10:56:38.070920 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tggrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2ebec10e00fab3194ba3501ab22f0d9381d9e75d28690a6ee5723da97fd226\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rv72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tggrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:38Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:38 crc kubenswrapper[4860]: I0320 10:56:38.084913 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q85gq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"035f0b3d-92ee-4564-8dad-28b231e1c800\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q85gq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:38Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:38 crc 
kubenswrapper[4860]: I0320 10:56:38.100656 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:38Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:38 crc kubenswrapper[4860]: I0320 10:56:38.115602 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:38Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:38 crc kubenswrapper[4860]: I0320 10:56:38.139076 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://355de5a5f850fa7e12c9ee5b4b152314e331aca100b142486cde31a6ca5b9f77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c66567a97f03870425c40452e85d7f5d3d9692343d67c511060081eff71f24a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"message\\\":\\\"mers/externalversions/factory.go:140\\\\nI0320 10:56:35.889696 6663 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 10:56:35.890004 6663 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 
10:56:35.890027 6663 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 10:56:35.890077 6663 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 10:56:35.890136 6663 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 10:56:35.890097 6663 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 10:56:35.890208 6663 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 10:56:35.890288 6663 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 10:56:35.890314 6663 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 10:56:35.890331 6663 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0320 10:56:35.890321 6663 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 10:56:35.890392 6663 factory.go:656] Stopping watch factory\\\\nI0320 10:56:35.890425 6663 ovnkube.go:599] Stopped ovnkube\\\\nI0320 10:56:35.890405 6663 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 
10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nbkmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:38Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:38 crc kubenswrapper[4860]: I0320 10:56:38.154780 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c14e94ed75bdc67469f86002f56ed08fc3810f5a6316401c00a66d56d405e14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:38Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:38 crc kubenswrapper[4860]: I0320 10:56:38.168904 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:38Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:38 crc kubenswrapper[4860]: I0320 10:56:38.182567 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ngb2s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a19086-4679-4d42-96b8-942e00d8491f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://029dd44d42d236df04007ba436ae1800ebdf356c32c3506f621c55418b50526e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a83599b2c542987a93965e3fd026abc1eccd
07fb78dc6ad777b03821eb4ed59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ngb2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:38Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:38 crc kubenswrapper[4860]: I0320 10:56:38.198631 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:38Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:38 crc kubenswrapper[4860]: I0320 10:56:38.211142 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://299876d90de1dc71dcbd805a769b991971944194b57cda3dff90392f35318dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:38Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:38 crc kubenswrapper[4860]: I0320 10:56:38.225285 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ba3ecd4dbf40f800c11196492112694eb2fe160969cbd5f0b6fa7335552777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2176d4712fa4ee31d4322bb4cf9433058b2da0d6000e5bfb9e7d230fdffb3eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:38Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:38 crc kubenswrapper[4860]: I0320 10:56:38.239600 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:38Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:38 crc kubenswrapper[4860]: I0320 10:56:38.258913 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c4df2ec9fe14b2447ca7aa8d5d033c15f8ac4a7ce1cd4cb8430596293d9b9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:38Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:38 crc kubenswrapper[4860]: I0320 10:56:38.282056 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-
dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ec
d6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:38Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:38 crc kubenswrapper[4860]: I0320 10:56:38.412964 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:38 crc kubenswrapper[4860]: I0320 10:56:38.413003 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:38 crc kubenswrapper[4860]: I0320 10:56:38.412919 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:56:38 crc kubenswrapper[4860]: E0320 10:56:38.413157 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:38 crc kubenswrapper[4860]: E0320 10:56:38.413403 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:38 crc kubenswrapper[4860]: E0320 10:56:38.413676 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:56:38 crc kubenswrapper[4860]: I0320 10:56:38.768151 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/035f0b3d-92ee-4564-8dad-28b231e1c800-metrics-certs\") pod \"network-metrics-daemon-q85gq\" (UID: \"035f0b3d-92ee-4564-8dad-28b231e1c800\") " pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:56:38 crc kubenswrapper[4860]: E0320 10:56:38.768456 4860 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 10:56:38 crc kubenswrapper[4860]: E0320 10:56:38.768612 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/035f0b3d-92ee-4564-8dad-28b231e1c800-metrics-certs podName:035f0b3d-92ee-4564-8dad-28b231e1c800 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:42.768581634 +0000 UTC m=+126.989942562 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/035f0b3d-92ee-4564-8dad-28b231e1c800-metrics-certs") pod "network-metrics-daemon-q85gq" (UID: "035f0b3d-92ee-4564-8dad-28b231e1c800") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 10:56:39 crc kubenswrapper[4860]: I0320 10:56:39.029216 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nbkmw_eb85f6f9-1c0f-4388-9464-25dfe48d8d0f/ovnkube-controller/1.log" Mar 20 10:56:39 crc kubenswrapper[4860]: I0320 10:56:39.030436 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nbkmw_eb85f6f9-1c0f-4388-9464-25dfe48d8d0f/ovnkube-controller/0.log" Mar 20 10:56:39 crc kubenswrapper[4860]: I0320 10:56:39.033799 4860 generic.go:334] "Generic (PLEG): container finished" podID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerID="355de5a5f850fa7e12c9ee5b4b152314e331aca100b142486cde31a6ca5b9f77" exitCode=1 Mar 20 10:56:39 crc kubenswrapper[4860]: I0320 10:56:39.033846 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" event={"ID":"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f","Type":"ContainerDied","Data":"355de5a5f850fa7e12c9ee5b4b152314e331aca100b142486cde31a6ca5b9f77"} Mar 20 10:56:39 crc kubenswrapper[4860]: I0320 10:56:39.033890 4860 scope.go:117] "RemoveContainer" containerID="3c66567a97f03870425c40452e85d7f5d3d9692343d67c511060081eff71f24a" Mar 20 10:56:39 crc kubenswrapper[4860]: I0320 10:56:39.035152 4860 scope.go:117] "RemoveContainer" containerID="355de5a5f850fa7e12c9ee5b4b152314e331aca100b142486cde31a6ca5b9f77" Mar 20 10:56:39 crc kubenswrapper[4860]: E0320 10:56:39.035689 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nbkmw_openshift-ovn-kubernetes(eb85f6f9-1c0f-4388-9464-25dfe48d8d0f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" Mar 20 10:56:39 crc kubenswrapper[4860]: I0320 10:56:39.055120 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:39Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:39 crc kubenswrapper[4860]: I0320 10:56:39.084857 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://355de5a5f850fa7e12c9ee5b4b152314e331aca100b142486cde31a6ca5b9f77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c66567a97f03870425c40452e85d7f5d3d9692343d67c511060081eff71f24a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:56:36Z\\\",\\\"message\\\":\\\"mers/externalversions/factory.go:140\\\\nI0320 10:56:35.889696 6663 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 10:56:35.890004 6663 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 
10:56:35.890027 6663 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 10:56:35.890077 6663 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 10:56:35.890136 6663 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 10:56:35.890097 6663 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 10:56:35.890208 6663 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 10:56:35.890288 6663 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 10:56:35.890314 6663 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 10:56:35.890331 6663 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0320 10:56:35.890321 6663 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 10:56:35.890392 6663 factory.go:656] Stopping watch factory\\\\nI0320 10:56:35.890425 6663 ovnkube.go:599] Stopped ovnkube\\\\nI0320 10:56:35.890405 6663 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://355de5a5f850fa7e12c9ee5b4b152314e331aca100b142486cde31a6ca5b9f77\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"curred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:38Z is after 2025-08-24T17:21:41Z]\\\\nI0320 10:56:38.278374 6996 services_controller.go:451] Built service openshift-etcd/etcd cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd/etcd_TCP_cluster\\\\\\\", 
UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd/etcd\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.253\\\\\\\", Port:2379, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.253\\\\\\\", Port:9979, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0320 10:56:38.278405 6996 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-additional-cni-plugins-wpj5w\\\\nI0320 10:56:38.278416 6996 services_controller.go:452] 
B\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\
\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nbkmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:39Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:39 crc kubenswrapper[4860]: I0320 10:56:39.107677 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"contain
erID\\\":\\\"cri-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c14e94ed75bdc67469f86002f56ed08fc3810f5a6316401c00a66d56d405e14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:39Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:39 crc kubenswrapper[4860]: I0320 10:56:39.143929 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:39Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:39 crc kubenswrapper[4860]: I0320 10:56:39.166465 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:39Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:39 crc kubenswrapper[4860]: I0320 10:56:39.188451 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ngb2s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a19086-4679-4d42-96b8-942e00d8491f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://029dd44d42d236df04007ba436ae1800ebdf356c32c3506f621c55418b50526e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a83599b2c542987a93965e3fd026abc1eccd
07fb78dc6ad777b03821eb4ed59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ngb2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:39Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:39 crc kubenswrapper[4860]: I0320 10:56:39.201483 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://299876d90de1dc71dcbd805a769b991971944194b57cda3dff90392f35318dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:39Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:39 crc kubenswrapper[4860]: I0320 10:56:39.214534 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ba3ecd4dbf40f800c11196492112694eb2fe160969cbd5f0b6fa7335552777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2176d4712fa4ee31d4322bb4cf9433058b2da0d6000e5bfb9e7d230fdffb3eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:39Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:39 crc kubenswrapper[4860]: I0320 10:56:39.230855 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:39Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:39 crc kubenswrapper[4860]: I0320 10:56:39.247256 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c4df2ec9fe14b2447ca7aa8d5d033c15f8ac4a7ce1cd4cb8430596293d9b9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:39Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:39 crc kubenswrapper[4860]: I0320 10:56:39.269708 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-
dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ec
d6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:39Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:39 crc kubenswrapper[4860]: I0320 10:56:39.285301 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:39Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:39 crc kubenswrapper[4860]: I0320 10:56:39.299856 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T10:56:39Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:39 crc kubenswrapper[4860]: I0320 10:56:39.313248 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tggrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2ebec10e00fab3194ba3501ab22f0d9381d9e75d28690a6ee5723da97fd226\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rv72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tggrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:39Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:39 crc kubenswrapper[4860]: I0320 10:56:39.328183 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q85gq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"035f0b3d-92ee-4564-8dad-28b231e1c800\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q85gq\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:39Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:39 crc kubenswrapper[4860]: I0320 10:56:39.345277 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"m
etrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:39Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:39 crc kubenswrapper[4860]: I0320 10:56:39.413104 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:39 crc kubenswrapper[4860]: E0320 10:56:39.413301 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:40 crc kubenswrapper[4860]: I0320 10:56:40.040164 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nbkmw_eb85f6f9-1c0f-4388-9464-25dfe48d8d0f/ovnkube-controller/1.log" Mar 20 10:56:40 crc kubenswrapper[4860]: I0320 10:56:40.045483 4860 scope.go:117] "RemoveContainer" containerID="355de5a5f850fa7e12c9ee5b4b152314e331aca100b142486cde31a6ca5b9f77" Mar 20 10:56:40 crc kubenswrapper[4860]: E0320 10:56:40.045786 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-nbkmw_openshift-ovn-kubernetes(eb85f6f9-1c0f-4388-9464-25dfe48d8d0f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" Mar 20 10:56:40 crc kubenswrapper[4860]: I0320 10:56:40.065199 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ngb2s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a19086-4679-4d42-96b8-942e00d8491f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://029dd44d42d236df04007ba436ae1800ebdf356c32c3506f621c55418b50526e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a83599b2c542987a93965e3fd026abc1eccd
07fb78dc6ad777b03821eb4ed59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ngb2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:40Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:40 crc kubenswrapper[4860]: I0320 10:56:40.094501 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:40Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:40 crc kubenswrapper[4860]: I0320 10:56:40.116821 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:40Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:40 crc kubenswrapper[4860]: I0320 10:56:40.133004 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://299876d90de1dc71dcbd805a769b991971944194b57cda3dff90392f35318dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:40Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:40 crc kubenswrapper[4860]: I0320 10:56:40.149956 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ba3ecd4dbf40f800c11196492112694eb2fe160969cbd5f0b6fa7335552777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2176d4712fa4ee31d4322bb4cf9433058b2da0d6000e5bfb9e7d230fdffb3eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:40Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:40 crc kubenswrapper[4860]: I0320 10:56:40.170259 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:40Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:40 crc kubenswrapper[4860]: I0320 10:56:40.191805 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c4df2ec9fe14b2447ca7aa8d5d033c15f8ac4a7ce1cd4cb8430596293d9b9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:40Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:40 crc kubenswrapper[4860]: I0320 10:56:40.209030 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:40Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:40 crc kubenswrapper[4860]: I0320 10:56:40.223116 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T10:56:40Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:40 crc kubenswrapper[4860]: I0320 10:56:40.238094 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tggrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2ebec10e00fab3194ba3501ab22f0d9381d9e75d28690a6ee5723da97fd226\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rv72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tggrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:40Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:40 crc kubenswrapper[4860]: I0320 10:56:40.251430 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q85gq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"035f0b3d-92ee-4564-8dad-28b231e1c800\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q85gq\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:40Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:40 crc kubenswrapper[4860]: I0320 10:56:40.267452 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c14e94ed75bdc67469f86002f56ed08fc3810f5a6316401c00a66d56d405e14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:40Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:40 crc kubenswrapper[4860]: I0320 10:56:40.282519 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:40Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:40 crc kubenswrapper[4860]: I0320 10:56:40.297191 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:40Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:40 crc kubenswrapper[4860]: I0320 10:56:40.313046 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:40Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:40 crc kubenswrapper[4860]: I0320 10:56:40.334473 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://355de5a5f850fa7e12c9ee5b4b152314e331aca100b142486cde31a6ca5b9f77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://355de5a5f850fa7e12c9ee5b4b152314e331aca100b142486cde31a6ca5b9f77\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"curred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:38Z is after 2025-08-24T17:21:41Z]\\\\nI0320 
10:56:38.278374 6996 services_controller.go:451] Built service openshift-etcd/etcd cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd/etcd_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd/etcd\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.253\\\\\\\", Port:2379, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.253\\\\\\\", Port:9979, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0320 10:56:38.278405 6996 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-additional-cni-plugins-wpj5w\\\\nI0320 10:56:38.278416 6996 services_controller.go:452] B\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nbkmw_openshift-ovn-kubernetes(eb85f6f9-1c0f-4388-9464-25dfe48d8d0f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f
8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nbkmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:40Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:40 crc kubenswrapper[4860]: I0320 10:56:40.412535 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:40 crc kubenswrapper[4860]: I0320 10:56:40.412535 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:56:40 crc kubenswrapper[4860]: I0320 10:56:40.412541 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:40 crc kubenswrapper[4860]: E0320 10:56:40.412819 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:56:40 crc kubenswrapper[4860]: E0320 10:56:40.412666 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:40 crc kubenswrapper[4860]: E0320 10:56:40.412889 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:41 crc kubenswrapper[4860]: I0320 10:56:41.412827 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:41 crc kubenswrapper[4860]: E0320 10:56:41.413051 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:42 crc kubenswrapper[4860]: I0320 10:56:42.412507 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:56:42 crc kubenswrapper[4860]: I0320 10:56:42.412533 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:42 crc kubenswrapper[4860]: I0320 10:56:42.412533 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:42 crc kubenswrapper[4860]: E0320 10:56:42.413949 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:56:42 crc kubenswrapper[4860]: E0320 10:56:42.414252 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:42 crc kubenswrapper[4860]: E0320 10:56:42.414355 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:42 crc kubenswrapper[4860]: E0320 10:56:42.588080 4860 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 10:56:42 crc kubenswrapper[4860]: I0320 10:56:42.643702 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:56:42 crc kubenswrapper[4860]: I0320 10:56:42.670541 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"da
ta-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c687744
1ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:42Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:42 crc kubenswrapper[4860]: I0320 10:56:42.693815 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:42Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:42 crc kubenswrapper[4860]: I0320 10:56:42.710809 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://299876d90de1dc71dcbd805a769b991971944194b57cda3dff90392f35318dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:42Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:42 crc kubenswrapper[4860]: I0320 10:56:42.729597 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ba3ecd4dbf40f800c11196492112694eb2fe160969cbd5f0b6fa7335552777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2176d4712fa4ee31d4322bb4cf9433058b2da0d6000e5bfb9e7d230fdffb3eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:42Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:42 crc kubenswrapper[4860]: I0320 10:56:42.747052 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:42Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:42 crc kubenswrapper[4860]: I0320 10:56:42.767472 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c4df2ec9fe14b2447ca7aa8d5d033c15f8ac4a7ce1cd4cb8430596293d9b9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:42Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:42 crc kubenswrapper[4860]: I0320 10:56:42.780665 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:42Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:42 crc kubenswrapper[4860]: I0320 10:56:42.792094 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T10:56:42Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:42 crc kubenswrapper[4860]: I0320 10:56:42.802929 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tggrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2ebec10e00fab3194ba3501ab22f0d9381d9e75d28690a6ee5723da97fd226\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rv72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tggrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:42Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:42 crc kubenswrapper[4860]: I0320 10:56:42.814792 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/035f0b3d-92ee-4564-8dad-28b231e1c800-metrics-certs\") pod \"network-metrics-daemon-q85gq\" (UID: \"035f0b3d-92ee-4564-8dad-28b231e1c800\") " pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:56:42 crc kubenswrapper[4860]: E0320 10:56:42.814985 4860 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 10:56:42 crc kubenswrapper[4860]: E0320 10:56:42.815064 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/035f0b3d-92ee-4564-8dad-28b231e1c800-metrics-certs podName:035f0b3d-92ee-4564-8dad-28b231e1c800 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:50.815048463 +0000 UTC m=+135.036409361 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/035f0b3d-92ee-4564-8dad-28b231e1c800-metrics-certs") pod "network-metrics-daemon-q85gq" (UID: "035f0b3d-92ee-4564-8dad-28b231e1c800") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 10:56:42 crc kubenswrapper[4860]: I0320 10:56:42.817007 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q85gq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"035f0b3d-92ee-4564-8dad-28b231e1c800\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q85gq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:42Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:42 crc 
kubenswrapper[4860]: I0320 10:56:42.830455 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:42Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:42 crc kubenswrapper[4860]: I0320 10:56:42.845458 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:42Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:42 crc kubenswrapper[4860]: I0320 10:56:42.859679 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:42Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:42 crc kubenswrapper[4860]: I0320 10:56:42.879371 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://355de5a5f850fa7e12c9ee5b4b152314e331aca100b142486cde31a6ca5b9f77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://355de5a5f850fa7e12c9ee5b4b152314e331aca100b142486cde31a6ca5b9f77\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"curred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:38Z is after 2025-08-24T17:21:41Z]\\\\nI0320 
10:56:38.278374 6996 services_controller.go:451] Built service openshift-etcd/etcd cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd/etcd_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd/etcd\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.253\\\\\\\", Port:2379, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.253\\\\\\\", Port:9979, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0320 10:56:38.278405 6996 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-additional-cni-plugins-wpj5w\\\\nI0320 10:56:38.278416 6996 services_controller.go:452] B\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nbkmw_openshift-ovn-kubernetes(eb85f6f9-1c0f-4388-9464-25dfe48d8d0f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f
8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nbkmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:42Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:42 crc kubenswrapper[4860]: I0320 10:56:42.893928 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c14e94ed75bdc67469f86002f56ed08fc3810f5a6316401c00a66d56d405e14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5
a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:42Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:42 crc kubenswrapper[4860]: I0320 10:56:42.905794 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ngb2s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a19086-4679-4d42-96b8-942e00d8491f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://029dd44d42d236df04007ba436ae1800ebdf356c32c3506f621c55418b50526e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a83599b2c542987a93965e3fd026abc1eccd
07fb78dc6ad777b03821eb4ed59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ngb2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:42Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:43 crc kubenswrapper[4860]: I0320 10:56:43.412829 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:43 crc kubenswrapper[4860]: E0320 10:56:43.413316 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:44 crc kubenswrapper[4860]: I0320 10:56:44.412473 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:44 crc kubenswrapper[4860]: I0320 10:56:44.412943 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:56:44 crc kubenswrapper[4860]: I0320 10:56:44.413255 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:44 crc kubenswrapper[4860]: E0320 10:56:44.413697 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:44 crc kubenswrapper[4860]: E0320 10:56:44.413905 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:56:44 crc kubenswrapper[4860]: E0320 10:56:44.413981 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:44 crc kubenswrapper[4860]: I0320 10:56:44.426669 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 20 10:56:45 crc kubenswrapper[4860]: I0320 10:56:45.412675 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:45 crc kubenswrapper[4860]: E0320 10:56:45.412902 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:46 crc kubenswrapper[4860]: I0320 10:56:46.412905 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:56:46 crc kubenswrapper[4860]: I0320 10:56:46.413002 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:46 crc kubenswrapper[4860]: I0320 10:56:46.412905 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:46 crc kubenswrapper[4860]: E0320 10:56:46.413124 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:56:46 crc kubenswrapper[4860]: E0320 10:56:46.413379 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:46 crc kubenswrapper[4860]: E0320 10:56:46.413558 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:46 crc kubenswrapper[4860]: I0320 10:56:46.956827 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:46 crc kubenswrapper[4860]: I0320 10:56:46.956872 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:46 crc kubenswrapper[4860]: I0320 10:56:46.956888 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:46 crc kubenswrapper[4860]: I0320 10:56:46.956904 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:46 crc kubenswrapper[4860]: I0320 10:56:46.956916 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:46Z","lastTransitionTime":"2026-03-20T10:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:46 crc kubenswrapper[4860]: E0320 10:56:46.973903 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:46Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:46 crc kubenswrapper[4860]: I0320 10:56:46.977586 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:46 crc kubenswrapper[4860]: I0320 10:56:46.977617 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:46 crc kubenswrapper[4860]: I0320 10:56:46.977627 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:46 crc kubenswrapper[4860]: I0320 10:56:46.977645 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:46 crc kubenswrapper[4860]: I0320 10:56:46.977656 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:46Z","lastTransitionTime":"2026-03-20T10:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:46 crc kubenswrapper[4860]: E0320 10:56:46.990898 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:46Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:46 crc kubenswrapper[4860]: I0320 10:56:46.995180 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:46 crc kubenswrapper[4860]: I0320 10:56:46.995251 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:46 crc kubenswrapper[4860]: I0320 10:56:46.995265 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:46 crc kubenswrapper[4860]: I0320 10:56:46.995283 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:46 crc kubenswrapper[4860]: I0320 10:56:46.995297 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:46Z","lastTransitionTime":"2026-03-20T10:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:47 crc kubenswrapper[4860]: E0320 10:56:47.009545 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:47Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:47 crc kubenswrapper[4860]: I0320 10:56:47.013815 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:47 crc kubenswrapper[4860]: I0320 10:56:47.013855 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:47 crc kubenswrapper[4860]: I0320 10:56:47.013865 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:47 crc kubenswrapper[4860]: I0320 10:56:47.013882 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:47 crc kubenswrapper[4860]: I0320 10:56:47.013893 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:47Z","lastTransitionTime":"2026-03-20T10:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:47 crc kubenswrapper[4860]: E0320 10:56:47.027413 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:47Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:47 crc kubenswrapper[4860]: I0320 10:56:47.032289 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:47 crc kubenswrapper[4860]: I0320 10:56:47.032352 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:47 crc kubenswrapper[4860]: I0320 10:56:47.032370 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:47 crc kubenswrapper[4860]: I0320 10:56:47.032409 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:47 crc kubenswrapper[4860]: I0320 10:56:47.032427 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:47Z","lastTransitionTime":"2026-03-20T10:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:47 crc kubenswrapper[4860]: E0320 10:56:47.047139 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:47Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:47 crc kubenswrapper[4860]: E0320 10:56:47.047344 4860 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 10:56:47 crc kubenswrapper[4860]: I0320 10:56:47.413286 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:47 crc kubenswrapper[4860]: E0320 10:56:47.413504 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:47 crc kubenswrapper[4860]: I0320 10:56:47.439422 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8
ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314
731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-2
0T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:47Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:47 crc kubenswrapper[4860]: I0320 10:56:47.456012 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:47Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:47 crc kubenswrapper[4860]: I0320 10:56:47.469697 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://299876d90de1dc71dcbd805a769b991971944194b57cda3dff90392f35318dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:47Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:47 crc kubenswrapper[4860]: I0320 10:56:47.483765 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ba3ecd4dbf40f800c11196492112694eb2fe160969cbd5f0b6fa7335552777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2176d4712fa4ee31d4322bb4cf9433058b2da0d6000e5bfb9e7d230fdffb3eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:47Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:47 crc kubenswrapper[4860]: I0320 10:56:47.499863 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:47Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:47 crc kubenswrapper[4860]: I0320 10:56:47.518437 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c4df2ec9fe14b2447ca7aa8d5d033c15f8ac4a7ce1cd4cb8430596293d9b9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:47Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:47 crc kubenswrapper[4860]: I0320 10:56:47.537524 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e92d2fc-00a4-4ad4-873d-d8d49b45c703\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9917a32ef81256fec12a0a0679be3cf3a2e1f5dab824c4c23c2cf252433a68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b968284eb846046e80483bafc625da31241ff1781ebd5216342bc2538064d124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acfeabcb583a79ac597fdc7d5ea5a7e28192a337743c9ac0d585d931e1a4406c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes
/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74e43d4563dfd8a16d0e44d9e42cbad850bf1fe0f96f4cf7d1699ea3b0beec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e74e43d4563dfd8a16d0e44d9e42cbad850bf1fe0f96f4cf7d1699ea3b0beec9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:47Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:47 crc kubenswrapper[4860]: I0320 10:56:47.554938 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:47Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:47 crc kubenswrapper[4860]: I0320 10:56:47.570393 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:47Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:47 crc kubenswrapper[4860]: I0320 10:56:47.583268 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tggrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2ebec10e00fab3194ba3501ab22f0d9381d9e75d28690a6ee5723da97fd226\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"n
ode-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rv72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tggrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:47Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:47 crc kubenswrapper[4860]: E0320 10:56:47.588770 4860 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 10:56:47 crc kubenswrapper[4860]: I0320 10:56:47.598109 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q85gq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"035f0b3d-92ee-4564-8dad-28b231e1c800\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q85gq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:47Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:47 crc 
kubenswrapper[4860]: I0320 10:56:47.613343 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:47Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:47 crc kubenswrapper[4860]: I0320 10:56:47.627953 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:47Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:47 crc kubenswrapper[4860]: I0320 10:56:47.640979 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:47Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:47 crc kubenswrapper[4860]: I0320 10:56:47.664653 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://355de5a5f850fa7e12c9ee5b4b152314e331aca100b142486cde31a6ca5b9f77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://355de5a5f850fa7e12c9ee5b4b152314e331aca100b142486cde31a6ca5b9f77\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"curred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:38Z is after 2025-08-24T17:21:41Z]\\\\nI0320 
10:56:38.278374 6996 services_controller.go:451] Built service openshift-etcd/etcd cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd/etcd_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd/etcd\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.253\\\\\\\", Port:2379, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.253\\\\\\\", Port:9979, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0320 10:56:38.278405 6996 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-additional-cni-plugins-wpj5w\\\\nI0320 10:56:38.278416 6996 services_controller.go:452] B\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nbkmw_openshift-ovn-kubernetes(eb85f6f9-1c0f-4388-9464-25dfe48d8d0f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f
8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nbkmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:47Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:47 crc kubenswrapper[4860]: I0320 10:56:47.681527 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c14e94ed75bdc67469f86002f56ed08fc3810f5a6316401c00a66d56d405e14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5
a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:47Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:47 crc kubenswrapper[4860]: I0320 10:56:47.695055 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ngb2s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a19086-4679-4d42-96b8-942e00d8491f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://029dd44d42d236df04007ba436ae1800ebdf356c32c3506f621c55418b50526e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a83599b2c542987a93965e3fd026abc1eccd
07fb78dc6ad777b03821eb4ed59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ngb2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:47Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:48 crc kubenswrapper[4860]: I0320 10:56:48.412634 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:56:48 crc kubenswrapper[4860]: I0320 10:56:48.412749 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:48 crc kubenswrapper[4860]: E0320 10:56:48.412802 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:56:48 crc kubenswrapper[4860]: I0320 10:56:48.412891 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:48 crc kubenswrapper[4860]: E0320 10:56:48.413265 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:48 crc kubenswrapper[4860]: E0320 10:56:48.413472 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:49 crc kubenswrapper[4860]: I0320 10:56:49.413300 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:49 crc kubenswrapper[4860]: E0320 10:56:49.413516 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:50 crc kubenswrapper[4860]: I0320 10:56:50.413347 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:50 crc kubenswrapper[4860]: I0320 10:56:50.413388 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:56:50 crc kubenswrapper[4860]: I0320 10:56:50.413379 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:50 crc kubenswrapper[4860]: E0320 10:56:50.413507 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:50 crc kubenswrapper[4860]: E0320 10:56:50.413771 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:56:50 crc kubenswrapper[4860]: E0320 10:56:50.413877 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:50 crc kubenswrapper[4860]: I0320 10:56:50.909763 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/035f0b3d-92ee-4564-8dad-28b231e1c800-metrics-certs\") pod \"network-metrics-daemon-q85gq\" (UID: \"035f0b3d-92ee-4564-8dad-28b231e1c800\") " pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:56:50 crc kubenswrapper[4860]: E0320 10:56:50.910023 4860 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 10:56:50 crc kubenswrapper[4860]: E0320 10:56:50.910155 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/035f0b3d-92ee-4564-8dad-28b231e1c800-metrics-certs podName:035f0b3d-92ee-4564-8dad-28b231e1c800 nodeName:}" failed. No retries permitted until 2026-03-20 10:57:06.910128606 +0000 UTC m=+151.131489674 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/035f0b3d-92ee-4564-8dad-28b231e1c800-metrics-certs") pod "network-metrics-daemon-q85gq" (UID: "035f0b3d-92ee-4564-8dad-28b231e1c800") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 10:56:51 crc kubenswrapper[4860]: I0320 10:56:51.412361 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:51 crc kubenswrapper[4860]: E0320 10:56:51.412536 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:52 crc kubenswrapper[4860]: I0320 10:56:52.412787 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:56:52 crc kubenswrapper[4860]: E0320 10:56:52.413032 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:56:52 crc kubenswrapper[4860]: I0320 10:56:52.412799 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:52 crc kubenswrapper[4860]: I0320 10:56:52.414260 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:52 crc kubenswrapper[4860]: E0320 10:56:52.414305 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:52 crc kubenswrapper[4860]: E0320 10:56:52.414527 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:52 crc kubenswrapper[4860]: I0320 10:56:52.414683 4860 scope.go:117] "RemoveContainer" containerID="355de5a5f850fa7e12c9ee5b4b152314e331aca100b142486cde31a6ca5b9f77" Mar 20 10:56:52 crc kubenswrapper[4860]: E0320 10:56:52.590740 4860 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 10:56:53 crc kubenswrapper[4860]: I0320 10:56:53.092253 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nbkmw_eb85f6f9-1c0f-4388-9464-25dfe48d8d0f/ovnkube-controller/1.log" Mar 20 10:56:53 crc kubenswrapper[4860]: I0320 10:56:53.095501 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" event={"ID":"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f","Type":"ContainerStarted","Data":"b952609a27351fa4ebbd7a4f68307aee93a3505e07bb3cb86439e4d963d26cd7"} Mar 20 10:56:53 crc kubenswrapper[4860]: I0320 10:56:53.096090 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:56:53 crc kubenswrapper[4860]: I0320 10:56:53.108352 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ngb2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a19086-4679-4d42-96b8-942e00d8491f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://029dd44d42d236df04007ba436ae1800ebdf356c32c
3506f621c55418b50526e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a83599b2c542987a93965e3fd026abc1eccd07fb78dc6ad777b03821eb4ed59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\
\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ngb2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:53Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:53 crc kubenswrapper[4860]: I0320 10:56:53.122663 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:53Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:53 crc kubenswrapper[4860]: I0320 10:56:53.135814 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://299876d90de1dc71dcbd805a769b991971944194b57cda3dff90392f35318dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:53Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:53 crc kubenswrapper[4860]: I0320 10:56:53.147964 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ba3ecd4dbf40f800c11196492112694eb2fe160969cbd5f0b6fa7335552777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2176d4712fa4ee31d4322bb4cf9433058b2da0d6000e5bfb9e7d230fdffb3eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:53Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:53 crc kubenswrapper[4860]: I0320 10:56:53.162001 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:53Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:53 crc kubenswrapper[4860]: I0320 10:56:53.182959 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c4df2ec9fe14b2447ca7aa8d5d033c15f8ac4a7ce1cd4cb8430596293d9b9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:53Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:53 crc kubenswrapper[4860]: I0320 10:56:53.209835 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-
dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ec
d6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:53Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:53 crc kubenswrapper[4860]: I0320 10:56:53.228403 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:53Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:53 crc kubenswrapper[4860]: I0320 10:56:53.244845 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:53Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:53 crc kubenswrapper[4860]: I0320 10:56:53.256473 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tggrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2ebec10e00fab3194ba3501ab22f0d9381d9e75d28690a6ee5723da97fd226\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"n
ode-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rv72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tggrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:53Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:53 crc kubenswrapper[4860]: I0320 10:56:53.270166 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q85gq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"035f0b3d-92ee-4564-8dad-28b231e1c800\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q85gq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:53Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:53 crc 
kubenswrapper[4860]: I0320 10:56:53.281215 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e92d2fc-00a4-4ad4-873d-d8d49b45c703\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9917a32ef81256fec12a0a0679be3cf3a2e1f5dab824c4c23c2cf252433a68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b968284eb846046e80483bafc625da31241ff1781ebd5216342bc2538064d124\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acfeabcb583a79ac597fdc7d5ea5a7e28192a337743c9ac0d585d931e1a4406c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74e43d4563dfd8a16d0e44d9e42cbad850bf1fe0f96f4cf7d1699ea3b0beec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e74e43d4563dfd8a16d0e44d9e42cbad850bf1fe0f96f4cf7d1699ea3b0beec9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:53Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:53 crc kubenswrapper[4860]: I0320 10:56:53.293793 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:53Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:53 crc kubenswrapper[4860]: I0320 10:56:53.308327 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:53Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:53 crc kubenswrapper[4860]: I0320 10:56:53.339477 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b952609a27351fa4ebbd7a4f68307aee93a3505e07bb3cb86439e4d963d26cd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://355de5a5f850fa7e12c9ee5b4b152314e331aca100b142486cde31a6ca5b9f77\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"curred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:38Z is after 2025-08-24T17:21:41Z]\\\\nI0320 
10:56:38.278374 6996 services_controller.go:451] Built service openshift-etcd/etcd cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd/etcd_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd/etcd\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.253\\\\\\\", Port:2379, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.253\\\\\\\", Port:9979, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0320 10:56:38.278405 6996 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-additional-cni-plugins-wpj5w\\\\nI0320 10:56:38.278416 6996 services_controller.go:452] 
B\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\"
:\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nbkmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:53Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:53 crc kubenswrapper[4860]: I0320 10:56:53.356511 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c14e94ed75bdc67469f86002f56ed08fc3810f5a6316401c00a66d56d405e14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5
a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:53Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:53 crc kubenswrapper[4860]: I0320 10:56:53.370488 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:53Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:53 crc kubenswrapper[4860]: I0320 10:56:53.413290 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:53 crc kubenswrapper[4860]: E0320 10:56:53.413451 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:54 crc kubenswrapper[4860]: I0320 10:56:54.101933 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nbkmw_eb85f6f9-1c0f-4388-9464-25dfe48d8d0f/ovnkube-controller/2.log" Mar 20 10:56:54 crc kubenswrapper[4860]: I0320 10:56:54.102917 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nbkmw_eb85f6f9-1c0f-4388-9464-25dfe48d8d0f/ovnkube-controller/1.log" Mar 20 10:56:54 crc kubenswrapper[4860]: I0320 10:56:54.107210 4860 generic.go:334] "Generic (PLEG): container finished" podID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerID="b952609a27351fa4ebbd7a4f68307aee93a3505e07bb3cb86439e4d963d26cd7" exitCode=1 Mar 20 10:56:54 crc kubenswrapper[4860]: I0320 10:56:54.107286 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" event={"ID":"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f","Type":"ContainerDied","Data":"b952609a27351fa4ebbd7a4f68307aee93a3505e07bb3cb86439e4d963d26cd7"} Mar 20 10:56:54 crc kubenswrapper[4860]: I0320 10:56:54.107333 4860 scope.go:117] "RemoveContainer" containerID="355de5a5f850fa7e12c9ee5b4b152314e331aca100b142486cde31a6ca5b9f77" Mar 20 10:56:54 crc kubenswrapper[4860]: I0320 10:56:54.108071 4860 scope.go:117] "RemoveContainer" containerID="b952609a27351fa4ebbd7a4f68307aee93a3505e07bb3cb86439e4d963d26cd7" Mar 20 10:56:54 crc kubenswrapper[4860]: E0320 10:56:54.108279 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-nbkmw_openshift-ovn-kubernetes(eb85f6f9-1c0f-4388-9464-25dfe48d8d0f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" Mar 20 10:56:54 crc kubenswrapper[4860]: 
I0320 10:56:54.130841 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ngb2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a19086-4679-4d42-96b8-942e00d8491f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://029dd44d42d236df04007ba436ae1800ebdf356c32c3506f621c55418b50526e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\"
,\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a83599b2c542987a93965e3fd026abc1eccd07fb78dc6ad777b03821eb4ed59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ngb2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:54Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:54 crc kubenswrapper[4860]: I0320 10:56:54.145395 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://299876d90de1dc71dcbd805a769b991971944194b57cda3dff90392f35318dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:54Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:54 crc kubenswrapper[4860]: I0320 10:56:54.157041 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ba3ecd4dbf40f800c11196492112694eb2fe160969cbd5f0b6fa7335552777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2176d4712fa4ee31d4322bb4cf9433058b2da0d6000e5bfb9e7d230fdffb3eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:54Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:54 crc kubenswrapper[4860]: I0320 10:56:54.172250 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:54Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:54 crc kubenswrapper[4860]: I0320 10:56:54.193677 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c4df2ec9fe14b2447ca7aa8d5d033c15f8ac4a7ce1cd4cb8430596293d9b9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:54Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:54 crc kubenswrapper[4860]: I0320 10:56:54.218415 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-
dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ec
d6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:54Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:54 crc kubenswrapper[4860]: I0320 10:56:54.231589 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:54Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:54 crc kubenswrapper[4860]: I0320 10:56:54.247304 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T10:56:54Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:54 crc kubenswrapper[4860]: I0320 10:56:54.261747 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tggrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2ebec10e00fab3194ba3501ab22f0d9381d9e75d28690a6ee5723da97fd226\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rv72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tggrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:54Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:54 crc kubenswrapper[4860]: I0320 10:56:54.276050 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q85gq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"035f0b3d-92ee-4564-8dad-28b231e1c800\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q85gq\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:54Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:54 crc kubenswrapper[4860]: I0320 10:56:54.302368 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e92d2fc-00a4-4ad4-873d-d8d49b45c703\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9917a32ef81256fec12a0a0679be3cf3a2e1f5dab824c4c23c2cf252433a68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b968284eb846046e80483bafc625da31241ff1781ebd5216342bc2538064d124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acfeabcb583a79ac597fdc7d5ea5a7e28192a337743c9ac0d585d931e1a4406c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74e43d4563dfd8a16d0e44d9e42cbad850bf1fe0
f96f4cf7d1699ea3b0beec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e74e43d4563dfd8a16d0e44d9e42cbad850bf1fe0f96f4cf7d1699ea3b0beec9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:54Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:54 crc kubenswrapper[4860]: I0320 10:56:54.321744 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:54Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:54 crc kubenswrapper[4860]: I0320 10:56:54.341387 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:54Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:54 crc kubenswrapper[4860]: I0320 10:56:54.372284 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b952609a27351fa4ebbd7a4f68307aee93a3505e07bb3cb86439e4d963d26cd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://355de5a5f850fa7e12c9ee5b4b152314e331aca100b142486cde31a6ca5b9f77\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"curred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:38Z is after 2025-08-24T17:21:41Z]\\\\nI0320 
10:56:38.278374 6996 services_controller.go:451] Built service openshift-etcd/etcd cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd/etcd_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd/etcd\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.253\\\\\\\", Port:2379, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.253\\\\\\\", Port:9979, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0320 10:56:38.278405 6996 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-additional-cni-plugins-wpj5w\\\\nI0320 10:56:38.278416 6996 services_controller.go:452] B\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b952609a27351fa4ebbd7a4f68307aee93a3505e07bb3cb86439e4d963d26cd7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:56:53Z\\\",\\\"message\\\":\\\"handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 10:56:53.462992 7196 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 10:56:53.463000 7196 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 10:56:53.463008 7196 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 10:56:53.463019 7196 handler.go:208] Removed *v1.EgressIP event 
handler 8\\\\nI0320 10:56:53.463348 7196 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0320 10:56:53.463395 7196 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 10:56:53.463446 7196 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 10:56:53.463458 7196 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0320 10:56:53.463471 7196 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0320 10:56:53.463476 7196 factory.go:656] Stopping watch factory\\\\nI0320 10:56:53.463506 7196 ovnkube.go:599] Stopped ovnkube\\\\nI0320 10:56:53.463520 7196 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 10:56:53.463508 7196 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 10:56:53.463560 7196 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0320 10:56:53.463584 7196 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 10:56:53.463671 7196 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\
\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e
7e238c11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nbkmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:54Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:54 crc kubenswrapper[4860]: I0320 10:56:54.391197 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c14e94ed75bdc67469f86002f56ed08fc3810f5a6316401c00a66d56d405e14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5
a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:54Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:54 crc kubenswrapper[4860]: I0320 10:56:54.406416 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:54Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:54 crc kubenswrapper[4860]: I0320 10:56:54.412855 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:56:54 crc kubenswrapper[4860]: I0320 10:56:54.412901 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:54 crc kubenswrapper[4860]: I0320 10:56:54.412855 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:54 crc kubenswrapper[4860]: E0320 10:56:54.413090 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:56:54 crc kubenswrapper[4860]: E0320 10:56:54.413170 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:54 crc kubenswrapper[4860]: E0320 10:56:54.413330 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:54 crc kubenswrapper[4860]: I0320 10:56:54.426177 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":
\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:54Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:55 crc kubenswrapper[4860]: I0320 10:56:55.112636 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nbkmw_eb85f6f9-1c0f-4388-9464-25dfe48d8d0f/ovnkube-controller/2.log" Mar 20 10:56:55 crc kubenswrapper[4860]: I0320 10:56:55.116488 4860 scope.go:117] "RemoveContainer" containerID="b952609a27351fa4ebbd7a4f68307aee93a3505e07bb3cb86439e4d963d26cd7" Mar 20 10:56:55 crc kubenswrapper[4860]: E0320 10:56:55.116814 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-nbkmw_openshift-ovn-kubernetes(eb85f6f9-1c0f-4388-9464-25dfe48d8d0f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" Mar 20 10:56:55 crc kubenswrapper[4860]: I0320 10:56:55.140771 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/
manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf28
98d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5
646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\
"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:55Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:55 crc kubenswrapper[4860]: I0320 10:56:55.153893 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:55Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:55 crc kubenswrapper[4860]: I0320 10:56:55.165064 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://299876d90de1dc71dcbd805a769b991971944194b57cda3dff90392f35318dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:55Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:55 crc kubenswrapper[4860]: I0320 10:56:55.180410 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ba3ecd4dbf40f800c11196492112694eb2fe160969cbd5f0b6fa7335552777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2176d4712fa4ee31d4322bb4cf9433058b2da0d6000e5bfb9e7d230fdffb3eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:55Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:55 crc kubenswrapper[4860]: I0320 10:56:55.204461 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:55Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:55 crc kubenswrapper[4860]: I0320 10:56:55.225044 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c4df2ec9fe14b2447ca7aa8d5d033c15f8ac4a7ce1cd4cb8430596293d9b9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:55Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:55 crc kubenswrapper[4860]: I0320 10:56:55.241026 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e92d2fc-00a4-4ad4-873d-d8d49b45c703\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9917a32ef81256fec12a0a0679be3cf3a2e1f5dab824c4c23c2cf252433a68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b968284eb846046e80483bafc625da31241ff1781ebd5216342bc2538064d124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acfeabcb583a79ac597fdc7d5ea5a7e28192a337743c9ac0d585d931e1a4406c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes
/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74e43d4563dfd8a16d0e44d9e42cbad850bf1fe0f96f4cf7d1699ea3b0beec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e74e43d4563dfd8a16d0e44d9e42cbad850bf1fe0f96f4cf7d1699ea3b0beec9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:55Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:55 crc kubenswrapper[4860]: I0320 10:56:55.260275 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:55Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:55 crc kubenswrapper[4860]: I0320 10:56:55.278675 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:55Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:55 crc kubenswrapper[4860]: I0320 10:56:55.291715 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tggrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2ebec10e00fab3194ba3501ab22f0d9381d9e75d28690a6ee5723da97fd226\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"n
ode-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rv72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tggrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:55Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:55 crc kubenswrapper[4860]: I0320 10:56:55.306470 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q85gq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"035f0b3d-92ee-4564-8dad-28b231e1c800\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q85gq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:55Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:55 crc 
kubenswrapper[4860]: I0320 10:56:55.321716 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:55Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:55 crc kubenswrapper[4860]: I0320 10:56:55.336826 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:55Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:55 crc kubenswrapper[4860]: I0320 10:56:55.351688 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:55Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:55 crc kubenswrapper[4860]: I0320 10:56:55.382935 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b952609a27351fa4ebbd7a4f68307aee93a3505e07bb3cb86439e4d963d26cd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b952609a27351fa4ebbd7a4f68307aee93a3505e07bb3cb86439e4d963d26cd7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:56:53Z\\\",\\\"message\\\":\\\"handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 10:56:53.462992 7196 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 10:56:53.463000 7196 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 10:56:53.463008 7196 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 10:56:53.463019 7196 handler.go:208] 
Removed *v1.EgressIP event handler 8\\\\nI0320 10:56:53.463348 7196 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0320 10:56:53.463395 7196 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 10:56:53.463446 7196 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 10:56:53.463458 7196 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0320 10:56:53.463471 7196 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0320 10:56:53.463476 7196 factory.go:656] Stopping watch factory\\\\nI0320 10:56:53.463506 7196 ovnkube.go:599] Stopped ovnkube\\\\nI0320 10:56:53.463520 7196 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 10:56:53.463508 7196 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 10:56:53.463560 7196 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0320 10:56:53.463584 7196 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 10:56:53.463671 7196 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nbkmw_openshift-ovn-kubernetes(eb85f6f9-1c0f-4388-9464-25dfe48d8d0f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f
8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nbkmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:55Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:55 crc kubenswrapper[4860]: I0320 10:56:55.404453 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c14e94ed75bdc67469f86002f56ed08fc3810f5a6316401c00a66d56d405e14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5
a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:55Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:55 crc kubenswrapper[4860]: I0320 10:56:55.412571 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:55 crc kubenswrapper[4860]: E0320 10:56:55.412718 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:55 crc kubenswrapper[4860]: I0320 10:56:55.421161 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ngb2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a19086-4679-4d42-96b8-942e00d8491f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://029dd44d42d236df04007ba436ae1800ebdf356c32c3506f621c55418b50526e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metri
cs-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a83599b2c542987a93965e3fd026abc1eccd07fb78dc6ad777b03821eb4ed59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ngb2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:55Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:56 crc kubenswrapper[4860]: I0320 10:56:56.413087 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:56 crc kubenswrapper[4860]: I0320 10:56:56.413147 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:56 crc kubenswrapper[4860]: I0320 10:56:56.413110 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:56:56 crc kubenswrapper[4860]: E0320 10:56:56.413322 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:56 crc kubenswrapper[4860]: E0320 10:56:56.413502 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:56:56 crc kubenswrapper[4860]: E0320 10:56:56.413641 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.144282 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.144345 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.144358 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.144590 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.144681 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:57Z","lastTransitionTime":"2026-03-20T10:56:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:57 crc kubenswrapper[4860]: E0320 10:56:57.167630 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:57Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.173356 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.173408 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.173423 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.173448 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.173463 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:57Z","lastTransitionTime":"2026-03-20T10:56:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:57 crc kubenswrapper[4860]: E0320 10:56:57.193051 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:57Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.197847 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.197907 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.197919 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.197941 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.197954 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:57Z","lastTransitionTime":"2026-03-20T10:56:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:57 crc kubenswrapper[4860]: E0320 10:56:57.227830 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:57Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.231876 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.231937 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.231948 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.231967 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.231982 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:57Z","lastTransitionTime":"2026-03-20T10:56:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:57 crc kubenswrapper[4860]: E0320 10:56:57.246899 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:57Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:57 crc kubenswrapper[4860]: E0320 10:56:57.247092 4860 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.412470 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:57 crc kubenswrapper[4860]: E0320 10:56:57.412608 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.425511 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ngb2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a19086-4679-4d42-96b8-942e00d8491f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://029dd44d42d236df04007ba436ae1800ebdf356c32c3506f621c55418b50526e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metri
cs-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a83599b2c542987a93965e3fd026abc1eccd07fb78dc6ad777b03821eb4ed59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ngb2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:57Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.446813 4860 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63
\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6e
f52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mount
Path\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:57Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.462878 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:57Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.474741 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://299876d90de1dc71dcbd805a769b991971944194b57cda3dff90392f35318dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:57Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.489962 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ba3ecd4dbf40f800c11196492112694eb2fe160969cbd5f0b6fa7335552777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2176d4712fa4ee31d4322bb4cf9433058b2da0d6000e5bfb9e7d230fdffb3eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:57Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.508125 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:57Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.531100 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c4df2ec9fe14b2447ca7aa8d5d033c15f8ac4a7ce1cd4cb8430596293d9b9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:57Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.545966 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e92d2fc-00a4-4ad4-873d-d8d49b45c703\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9917a32ef81256fec12a0a0679be3cf3a2e1f5dab824c4c23c2cf252433a68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b968284eb846046e80483bafc625da31241ff1781ebd5216342bc2538064d124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acfeabcb583a79ac597fdc7d5ea5a7e28192a337743c9ac0d585d931e1a4406c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes
/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74e43d4563dfd8a16d0e44d9e42cbad850bf1fe0f96f4cf7d1699ea3b0beec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e74e43d4563dfd8a16d0e44d9e42cbad850bf1fe0f96f4cf7d1699ea3b0beec9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:57Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.560779 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:57Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.573141 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:57Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.588944 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tggrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2ebec10e00fab3194ba3501ab22f0d9381d9e75d28690a6ee5723da97fd226\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"n
ode-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rv72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tggrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:57Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:57 crc kubenswrapper[4860]: E0320 10:56:57.592095 4860 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.611145 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q85gq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"035f0b3d-92ee-4564-8dad-28b231e1c800\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q85gq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:57Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:57 crc 
kubenswrapper[4860]: I0320 10:56:57.629631 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd2
5da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c14e94ed75bdc67469f86002f56ed08fc3810f5a6316401c00a66d56d405e14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 
10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:57Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.644701 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:57Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.658663 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:57Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.685486 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:57Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:57 crc kubenswrapper[4860]: I0320 10:56:57.721834 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b952609a27351fa4ebbd7a4f68307aee93a3505e07bb3cb86439e4d963d26cd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b952609a27351fa4ebbd7a4f68307aee93a3505e07bb3cb86439e4d963d26cd7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:56:53Z\\\",\\\"message\\\":\\\"handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 10:56:53.462992 7196 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 10:56:53.463000 7196 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 10:56:53.463008 7196 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 10:56:53.463019 7196 handler.go:208] 
Removed *v1.EgressIP event handler 8\\\\nI0320 10:56:53.463348 7196 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0320 10:56:53.463395 7196 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 10:56:53.463446 7196 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 10:56:53.463458 7196 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0320 10:56:53.463471 7196 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0320 10:56:53.463476 7196 factory.go:656] Stopping watch factory\\\\nI0320 10:56:53.463506 7196 ovnkube.go:599] Stopped ovnkube\\\\nI0320 10:56:53.463520 7196 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 10:56:53.463508 7196 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 10:56:53.463560 7196 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0320 10:56:53.463584 7196 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 10:56:53.463671 7196 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nbkmw_openshift-ovn-kubernetes(eb85f6f9-1c0f-4388-9464-25dfe48d8d0f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f
8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nbkmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:57Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:58 crc kubenswrapper[4860]: I0320 10:56:58.412739 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:58 crc kubenswrapper[4860]: I0320 10:56:58.412776 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:58 crc kubenswrapper[4860]: I0320 10:56:58.412843 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:56:58 crc kubenswrapper[4860]: E0320 10:56:58.412953 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:58 crc kubenswrapper[4860]: E0320 10:56:58.413275 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:58 crc kubenswrapper[4860]: E0320 10:56:58.413714 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:56:59 crc kubenswrapper[4860]: I0320 10:56:59.412900 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:59 crc kubenswrapper[4860]: E0320 10:56:59.413171 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:00 crc kubenswrapper[4860]: I0320 10:57:00.412950 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:00 crc kubenswrapper[4860]: I0320 10:57:00.413152 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:00 crc kubenswrapper[4860]: I0320 10:57:00.413152 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:57:00 crc kubenswrapper[4860]: E0320 10:57:00.413846 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:00 crc kubenswrapper[4860]: E0320 10:57:00.414039 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:57:00 crc kubenswrapper[4860]: E0320 10:57:00.414294 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:01 crc kubenswrapper[4860]: I0320 10:57:01.413367 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:01 crc kubenswrapper[4860]: E0320 10:57:01.413628 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:02 crc kubenswrapper[4860]: I0320 10:57:02.412648 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:02 crc kubenswrapper[4860]: I0320 10:57:02.412698 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:02 crc kubenswrapper[4860]: E0320 10:57:02.412906 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:02 crc kubenswrapper[4860]: I0320 10:57:02.412686 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:57:02 crc kubenswrapper[4860]: E0320 10:57:02.413533 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:02 crc kubenswrapper[4860]: E0320 10:57:02.413625 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:57:02 crc kubenswrapper[4860]: E0320 10:57:02.594014 4860 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 10:57:03 crc kubenswrapper[4860]: I0320 10:57:03.412857 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:03 crc kubenswrapper[4860]: E0320 10:57:03.413133 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:04 crc kubenswrapper[4860]: I0320 10:57:04.413108 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:57:04 crc kubenswrapper[4860]: E0320 10:57:04.413290 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:57:04 crc kubenswrapper[4860]: I0320 10:57:04.413737 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:04 crc kubenswrapper[4860]: E0320 10:57:04.413786 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:04 crc kubenswrapper[4860]: I0320 10:57:04.413824 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:04 crc kubenswrapper[4860]: E0320 10:57:04.413865 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:05 crc kubenswrapper[4860]: I0320 10:57:05.413163 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:05 crc kubenswrapper[4860]: E0320 10:57:05.413406 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:06 crc kubenswrapper[4860]: I0320 10:57:06.413174 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:57:06 crc kubenswrapper[4860]: I0320 10:57:06.413196 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:06 crc kubenswrapper[4860]: I0320 10:57:06.413286 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:06 crc kubenswrapper[4860]: E0320 10:57:06.413693 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:57:06 crc kubenswrapper[4860]: E0320 10:57:06.413860 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:06 crc kubenswrapper[4860]: E0320 10:57:06.413920 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.008550 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/035f0b3d-92ee-4564-8dad-28b231e1c800-metrics-certs\") pod \"network-metrics-daemon-q85gq\" (UID: \"035f0b3d-92ee-4564-8dad-28b231e1c800\") " pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:57:07 crc kubenswrapper[4860]: E0320 10:57:07.008859 4860 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 10:57:07 crc kubenswrapper[4860]: E0320 10:57:07.008985 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/035f0b3d-92ee-4564-8dad-28b231e1c800-metrics-certs podName:035f0b3d-92ee-4564-8dad-28b231e1c800 nodeName:}" failed. 
No retries permitted until 2026-03-20 10:57:39.008948025 +0000 UTC m=+183.230308963 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/035f0b3d-92ee-4564-8dad-28b231e1c800-metrics-certs") pod "network-metrics-daemon-q85gq" (UID: "035f0b3d-92ee-4564-8dad-28b231e1c800") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.412736 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:07 crc kubenswrapper[4860]: E0320 10:57:07.412965 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.430201 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ngb2s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a19086-4679-4d42-96b8-942e00d8491f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://029dd44d42d236df04007ba436ae1800ebdf356c32c3506f621c55418b50526e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a83599b2c542987a93965e3fd026abc1eccd
07fb78dc6ad777b03821eb4ed59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ngb2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.446991 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://299876d90de1dc71dcbd805a769b991971944194b57cda3dff90392f35318dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.466539 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ba3ecd4dbf40f800c11196492112694eb2fe160969cbd5f0b6fa7335552777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2176d4712fa4ee31d4322bb4cf9433058b2da0d6000e5bfb9e7d230fdffb3eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.492187 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.514196 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c4df2ec9fe14b2447ca7aa8d5d033c15f8ac4a7ce1cd4cb8430596293d9b9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.542647 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-
dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ec
d6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.564142 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.584437 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T10:57:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:07 crc kubenswrapper[4860]: E0320 10:57:07.594844 4860 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.602793 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tggrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2ebec10e00fab3194ba3501ab22f0d9381d9e75d28690a6ee5723da97fd226\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nod
e-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rv72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tggrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.622584 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.622634 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.622653 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.622672 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.622686 4860 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:07Z","lastTransitionTime":"2026-03-20T10:57:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.623578 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q85gq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"035f0b3d-92ee-4564-8dad-28b231e1c800\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q85gq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:07 crc 
kubenswrapper[4860]: E0320 10:57:07.636510 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.641038 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.641188 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.641335 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.641418 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.641481 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:07Z","lastTransitionTime":"2026-03-20T10:57:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.641009 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e92d2fc-00a4-4ad4-873d-d8d49b45c703\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9917a32ef81256fec12a0a0679be3cf3a2e1f5dab824c4c23c2cf252433a68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b968284eb846046e80483bafc625da
31241ff1781ebd5216342bc2538064d124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acfeabcb583a79ac597fdc7d5ea5a7e28192a337743c9ac0d585d931e1a4406c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74e43d4563dfd8a16d0e44d9e42cbad850bf1fe0f96f4cf7d1699ea3b0beec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e74e43d4563dfd8a16d0e44d9e42cbad850bf1fe0f96f4cf7d1699ea3b0beec9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:07 crc kubenswrapper[4860]: E0320 10:57:07.655945 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.657254 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"na
me\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.659570 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.659622 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.659633 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.659657 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.659670 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:07Z","lastTransitionTime":"2026-03-20T10:57:07Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.672172 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:07 crc kubenswrapper[4860]: E0320 10:57:07.673984 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.679155 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.679192 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.679203 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.679238 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.679254 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:07Z","lastTransitionTime":"2026-03-20T10:57:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:57:07 crc kubenswrapper[4860]: E0320 10:57:07.693432 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.697624 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.697653 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.697665 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.697688 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.697701 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:07Z","lastTransitionTime":"2026-03-20T10:57:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.702145 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b952609a27351fa4ebbd7a4f68307aee93a3505e07bb3cb86439e4d963d26cd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b952609a27351fa4ebbd7a4f68307aee93a3505e07bb3cb86439e4d963d26cd7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:56:53Z\\\",\\\"message\\\":\\\"handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 10:56:53.462992 7196 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 10:56:53.463000 7196 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 10:56:53.463008 7196 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 10:56:53.463019 7196 handler.go:208] 
Removed *v1.EgressIP event handler 8\\\\nI0320 10:56:53.463348 7196 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0320 10:56:53.463395 7196 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 10:56:53.463446 7196 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 10:56:53.463458 7196 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0320 10:56:53.463471 7196 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0320 10:56:53.463476 7196 factory.go:656] Stopping watch factory\\\\nI0320 10:56:53.463506 7196 ovnkube.go:599] Stopped ovnkube\\\\nI0320 10:56:53.463520 7196 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 10:56:53.463508 7196 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 10:56:53.463560 7196 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0320 10:56:53.463584 7196 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 10:56:53.463671 7196 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nbkmw_openshift-ovn-kubernetes(eb85f6f9-1c0f-4388-9464-25dfe48d8d0f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f
8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nbkmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:07 crc kubenswrapper[4860]: E0320 10:57:07.709530 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:07 crc kubenswrapper[4860]: E0320 10:57:07.709678 4860 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.717390 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c14e94ed75bdc67469f86002f56ed08fc3810f5a6316401c00a66d56d405e14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5
a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.733951 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:07 crc kubenswrapper[4860]: I0320 10:57:07.750655 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:08 crc kubenswrapper[4860]: I0320 10:57:08.220116 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:57:08 crc kubenswrapper[4860]: E0320 10:57:08.220305 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:12.220280009 +0000 UTC m=+216.441640907 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:08 crc kubenswrapper[4860]: I0320 10:57:08.322075 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:08 crc kubenswrapper[4860]: I0320 10:57:08.322149 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:08 crc kubenswrapper[4860]: I0320 10:57:08.322177 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:08 crc kubenswrapper[4860]: I0320 10:57:08.322205 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:08 crc kubenswrapper[4860]: E0320 10:57:08.322379 4860 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 10:57:08 crc kubenswrapper[4860]: E0320 10:57:08.322409 4860 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 10:57:08 crc kubenswrapper[4860]: E0320 10:57:08.322455 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 10:58:12.322431324 +0000 UTC m=+216.543792242 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 10:57:08 crc kubenswrapper[4860]: E0320 10:57:08.322533 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 10:58:12.322503856 +0000 UTC m=+216.543864764 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 10:57:08 crc kubenswrapper[4860]: E0320 10:57:08.322675 4860 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 10:57:08 crc kubenswrapper[4860]: E0320 10:57:08.322691 4860 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 10:57:08 crc kubenswrapper[4860]: E0320 10:57:08.322705 4860 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:57:08 crc kubenswrapper[4860]: E0320 10:57:08.322740 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 10:58:12.322730833 +0000 UTC m=+216.544091741 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:57:08 crc kubenswrapper[4860]: E0320 10:57:08.323070 4860 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 10:57:08 crc kubenswrapper[4860]: E0320 10:57:08.323097 4860 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 10:57:08 crc kubenswrapper[4860]: E0320 10:57:08.323109 4860 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:57:08 crc kubenswrapper[4860]: E0320 10:57:08.323153 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 10:58:12.323139194 +0000 UTC m=+216.544500102 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:57:08 crc kubenswrapper[4860]: I0320 10:57:08.413287 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:57:08 crc kubenswrapper[4860]: I0320 10:57:08.413349 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:08 crc kubenswrapper[4860]: E0320 10:57:08.413703 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:08 crc kubenswrapper[4860]: E0320 10:57:08.414010 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:57:08 crc kubenswrapper[4860]: I0320 10:57:08.414193 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:08 crc kubenswrapper[4860]: E0320 10:57:08.415270 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:09 crc kubenswrapper[4860]: I0320 10:57:09.413593 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:09 crc kubenswrapper[4860]: E0320 10:57:09.414555 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:09 crc kubenswrapper[4860]: I0320 10:57:09.415287 4860 scope.go:117] "RemoveContainer" containerID="b952609a27351fa4ebbd7a4f68307aee93a3505e07bb3cb86439e4d963d26cd7" Mar 20 10:57:09 crc kubenswrapper[4860]: E0320 10:57:09.415745 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-nbkmw_openshift-ovn-kubernetes(eb85f6f9-1c0f-4388-9464-25dfe48d8d0f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" Mar 20 10:57:10 crc kubenswrapper[4860]: I0320 10:57:10.413394 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:10 crc kubenswrapper[4860]: I0320 10:57:10.413452 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:57:10 crc kubenswrapper[4860]: E0320 10:57:10.413560 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:10 crc kubenswrapper[4860]: E0320 10:57:10.413735 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:57:10 crc kubenswrapper[4860]: I0320 10:57:10.413772 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:10 crc kubenswrapper[4860]: E0320 10:57:10.413951 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:11 crc kubenswrapper[4860]: I0320 10:57:11.413312 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:11 crc kubenswrapper[4860]: E0320 10:57:11.413596 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:12 crc kubenswrapper[4860]: I0320 10:57:12.189016 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cmc44_a89c8af2-338f-401f-aad5-c6d7763a3b3a/kube-multus/0.log" Mar 20 10:57:12 crc kubenswrapper[4860]: I0320 10:57:12.189123 4860 generic.go:334] "Generic (PLEG): container finished" podID="a89c8af2-338f-401f-aad5-c6d7763a3b3a" containerID="b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311" exitCode=1 Mar 20 10:57:12 crc kubenswrapper[4860]: I0320 10:57:12.189171 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cmc44" event={"ID":"a89c8af2-338f-401f-aad5-c6d7763a3b3a","Type":"ContainerDied","Data":"b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311"} Mar 20 10:57:12 crc kubenswrapper[4860]: I0320 10:57:12.189779 4860 scope.go:117] "RemoveContainer" containerID="b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311" Mar 20 10:57:12 crc kubenswrapper[4860]: I0320 10:57:12.210733 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ngb2s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a19086-4679-4d42-96b8-942e00d8491f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://029dd44d42d236df04007ba436ae1800ebdf356c32c3506f621c55418b50526e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a83599b2c542987a93965e3fd026abc1eccd
07fb78dc6ad777b03821eb4ed59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ngb2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:12Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:12 crc kubenswrapper[4860]: I0320 10:57:12.250117 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:12Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:12 crc kubenswrapper[4860]: I0320 10:57:12.268806 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:12Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:12 crc kubenswrapper[4860]: I0320 10:57:12.284610 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://299876d90de1dc71dcbd805a769b991971944194b57cda3dff90392f35318dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:12Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:12 crc kubenswrapper[4860]: I0320 10:57:12.299185 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ba3ecd4dbf40f800c11196492112694eb2fe160969cbd5f0b6fa7335552777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2176d4712fa4ee31d4322bb4cf9433058b2da0d6000e5bfb9e7d230fdffb3eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:12Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:12 crc kubenswrapper[4860]: I0320 10:57:12.314260 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:57:11Z\\\",\\\"message\\\":\\\"2026-03-20T10:56:25+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_e1884f66-2279-4212-9c83-f9ac441fb962\\\\n2026-03-20T10:56:25+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_e1884f66-2279-4212-9c83-f9ac441fb962 to /host/opt/cni/bin/\\\\n2026-03-20T10:56:26Z [verbose] multus-daemon started\\\\n2026-03-20T10:56:26Z [verbose] Readiness Indicator file check\\\\n2026-03-20T10:57:11Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:12Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:12 crc kubenswrapper[4860]: I0320 10:57:12.331909 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c4df2ec9fe14b2447ca7aa8d5d033c15f8ac4a7ce1cd4cb8430596293d9b9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"
kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddf
bb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"reason\\
\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"image\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mount
Path\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:12Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:12 crc kubenswrapper[4860]: I0320 10:57:12.347253 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e92d2fc-00a4-4ad4-873d-d8d49b45c703\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9917a32ef81256fec12a0a0679be3cf3a2e1f5dab824c4c23c2cf252433a68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b968284eb846046e80483bafc625da31241ff1781ebd5216342bc2538064d124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acfeabcb583a79ac597fdc7d5ea5a7e28192a337743c9ac0d585d931e1a4406c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74e43d4563dfd8a16d0e44d9e42cbad850bf1fe0f96f4cf7d1699ea3b0beec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e74e43d4563dfd8a16d0e44d9e42cbad850bf1fe0f96f4cf7d1699ea3b0beec9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:12Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:12 crc kubenswrapper[4860]: I0320 10:57:12.364930 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:12Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:12 crc kubenswrapper[4860]: I0320 10:57:12.380345 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:12Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:12 crc kubenswrapper[4860]: I0320 10:57:12.392740 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tggrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2ebec10e00fab3194ba3501ab22f0d9381d9e75d28690a6ee5723da97fd226\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"n
ode-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rv72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tggrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:12Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:12 crc kubenswrapper[4860]: I0320 10:57:12.405300 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q85gq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"035f0b3d-92ee-4564-8dad-28b231e1c800\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q85gq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:12Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:12 crc 
kubenswrapper[4860]: I0320 10:57:12.412728 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:57:12 crc kubenswrapper[4860]: E0320 10:57:12.412874 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:57:12 crc kubenswrapper[4860]: I0320 10:57:12.413076 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:12 crc kubenswrapper[4860]: E0320 10:57:12.413152 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:12 crc kubenswrapper[4860]: I0320 10:57:12.413333 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:12 crc kubenswrapper[4860]: E0320 10:57:12.413425 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:12 crc kubenswrapper[4860]: I0320 10:57:12.419012 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-d
ir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c14e94ed75bdc67469f86002f56ed08fc3810f5a6316401c00a66d56d405e14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshi
ft-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' 
detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:12Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:12 crc kubenswrapper[4860]: I0320 10:57:12.432650 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:12Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:12 crc kubenswrapper[4860]: I0320 10:57:12.446185 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:12Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:12 crc kubenswrapper[4860]: I0320 10:57:12.459781 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:12Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:12 crc kubenswrapper[4860]: I0320 10:57:12.482707 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b952609a27351fa4ebbd7a4f68307aee93a3505e07bb3cb86439e4d963d26cd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b952609a27351fa4ebbd7a4f68307aee93a3505e07bb3cb86439e4d963d26cd7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:56:53Z\\\",\\\"message\\\":\\\"handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 10:56:53.462992 7196 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 10:56:53.463000 7196 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 10:56:53.463008 7196 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 10:56:53.463019 7196 handler.go:208] 
Removed *v1.EgressIP event handler 8\\\\nI0320 10:56:53.463348 7196 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0320 10:56:53.463395 7196 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 10:56:53.463446 7196 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 10:56:53.463458 7196 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0320 10:56:53.463471 7196 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0320 10:56:53.463476 7196 factory.go:656] Stopping watch factory\\\\nI0320 10:56:53.463506 7196 ovnkube.go:599] Stopped ovnkube\\\\nI0320 10:56:53.463520 7196 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 10:56:53.463508 7196 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 10:56:53.463560 7196 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0320 10:56:53.463584 7196 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 10:56:53.463671 7196 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nbkmw_openshift-ovn-kubernetes(eb85f6f9-1c0f-4388-9464-25dfe48d8d0f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f
8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nbkmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:12Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:12 crc kubenswrapper[4860]: E0320 10:57:12.596726 4860 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 10:57:13 crc kubenswrapper[4860]: I0320 10:57:13.197053 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cmc44_a89c8af2-338f-401f-aad5-c6d7763a3b3a/kube-multus/0.log" Mar 20 10:57:13 crc kubenswrapper[4860]: I0320 10:57:13.197151 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cmc44" event={"ID":"a89c8af2-338f-401f-aad5-c6d7763a3b3a","Type":"ContainerStarted","Data":"e1d897530152cf8d1ddca69100f0ae29a4da57552de29a47aaed46aa70fa805e"} Mar 20 10:57:13 crc kubenswrapper[4860]: I0320 10:57:13.215770 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://299876d90de1dc71dcbd805a769b991971944194b57cda3dff90392f35318dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:355123
35ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:13Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:13 crc kubenswrapper[4860]: I0320 10:57:13.237622 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ba3ecd4dbf40f800c11196492112694eb2fe160969cbd5f0b6fa7335552777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2176d4712fa4ee31d4322bb4cf9433058b2da0d6
000e5bfb9e7d230fdffb3eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:13Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:13 crc kubenswrapper[4860]: I0320 10:57:13.258406 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d897530152cf8d1ddca69100f0ae29a4da57552de29a47aaed46aa70fa805e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:57:11Z\\\",\\\"message\\\":\\\"2026-03-20T10:56:25+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_e1884f66-2279-4212-9c83-f9ac441fb962\\\\n2026-03-20T10:56:25+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_e1884f66-2279-4212-9c83-f9ac441fb962 to /host/opt/cni/bin/\\\\n2026-03-20T10:56:26Z [verbose] multus-daemon started\\\\n2026-03-20T10:56:26Z [verbose] 
Readiness Indicator file check\\\\n2026-03-20T10:57:11Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:57:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:13Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:13 crc kubenswrapper[4860]: I0320 10:57:13.283086 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c4
df2ec9fe14b2447ca7aa8d5d033c15f8ac4a7ce1cd4cb8430596293d9b9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:13Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:13 crc kubenswrapper[4860]: I0320 10:57:13.311948 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:13Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:13 crc kubenswrapper[4860]: I0320 10:57:13.335668 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:13Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:13 crc kubenswrapper[4860]: I0320 10:57:13.357640 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T10:57:13Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:13 crc kubenswrapper[4860]: I0320 10:57:13.374048 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tggrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2ebec10e00fab3194ba3501ab22f0d9381d9e75d28690a6ee5723da97fd226\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rv72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tggrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:13Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:13 crc kubenswrapper[4860]: I0320 10:57:13.392174 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q85gq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"035f0b3d-92ee-4564-8dad-28b231e1c800\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q85gq\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:13Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:13 crc kubenswrapper[4860]: I0320 10:57:13.409504 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e92d2fc-00a4-4ad4-873d-d8d49b45c703\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9917a32ef81256fec12a0a0679be3cf3a2e1f5dab824c4c23c2cf252433a68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b968284eb846046e80483bafc625da31241ff1781ebd5216342bc2538064d124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acfeabcb583a79ac597fdc7d5ea5a7e28192a337743c9ac0d585d931e1a4406c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74e43d4563dfd8a16d0e44d9e42cbad850bf1fe0
f96f4cf7d1699ea3b0beec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e74e43d4563dfd8a16d0e44d9e42cbad850bf1fe0f96f4cf7d1699ea3b0beec9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:13Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:13 crc kubenswrapper[4860]: I0320 10:57:13.412888 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:13 crc kubenswrapper[4860]: E0320 10:57:13.413042 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:13 crc kubenswrapper[4860]: I0320 10:57:13.432103 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"r
ecursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:13Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:13 crc kubenswrapper[4860]: I0320 10:57:13.449011 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:13Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:13 crc kubenswrapper[4860]: I0320 10:57:13.475243 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b952609a27351fa4ebbd7a4f68307aee93a3505e07bb3cb86439e4d963d26cd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b952609a27351fa4ebbd7a4f68307aee93a3505e07bb3cb86439e4d963d26cd7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:56:53Z\\\",\\\"message\\\":\\\"handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 10:56:53.462992 7196 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 10:56:53.463000 7196 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 10:56:53.463008 7196 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 10:56:53.463019 7196 handler.go:208] 
Removed *v1.EgressIP event handler 8\\\\nI0320 10:56:53.463348 7196 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0320 10:56:53.463395 7196 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 10:56:53.463446 7196 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 10:56:53.463458 7196 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0320 10:56:53.463471 7196 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0320 10:56:53.463476 7196 factory.go:656] Stopping watch factory\\\\nI0320 10:56:53.463506 7196 ovnkube.go:599] Stopped ovnkube\\\\nI0320 10:56:53.463520 7196 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 10:56:53.463508 7196 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 10:56:53.463560 7196 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0320 10:56:53.463584 7196 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 10:56:53.463671 7196 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nbkmw_openshift-ovn-kubernetes(eb85f6f9-1c0f-4388-9464-25dfe48d8d0f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f
8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nbkmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:13Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:13 crc kubenswrapper[4860]: I0320 10:57:13.496371 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c14e94ed75bdc67469f86002f56ed08fc3810f5a6316401c00a66d56d405e14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5
a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:13Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:13 crc kubenswrapper[4860]: I0320 10:57:13.512707 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:13Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:13 crc kubenswrapper[4860]: I0320 10:57:13.529071 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:13Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:13 crc kubenswrapper[4860]: I0320 10:57:13.544592 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ngb2s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a19086-4679-4d42-96b8-942e00d8491f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://029dd44d42d236df04007ba436ae1800ebdf356c32c3506f621c55418b50526e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a83599b2c542987a93965e3fd026abc1eccd
07fb78dc6ad777b03821eb4ed59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ngb2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:13Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:14 crc kubenswrapper[4860]: I0320 10:57:14.412941 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:57:14 crc kubenswrapper[4860]: I0320 10:57:14.412991 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:14 crc kubenswrapper[4860]: I0320 10:57:14.413029 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:14 crc kubenswrapper[4860]: E0320 10:57:14.413165 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:57:14 crc kubenswrapper[4860]: E0320 10:57:14.413454 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:14 crc kubenswrapper[4860]: E0320 10:57:14.413487 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:15 crc kubenswrapper[4860]: I0320 10:57:15.413419 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:15 crc kubenswrapper[4860]: E0320 10:57:15.413644 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:15 crc kubenswrapper[4860]: I0320 10:57:15.425432 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 20 10:57:16 crc kubenswrapper[4860]: I0320 10:57:16.412743 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:16 crc kubenswrapper[4860]: I0320 10:57:16.412822 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:16 crc kubenswrapper[4860]: E0320 10:57:16.413540 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:16 crc kubenswrapper[4860]: I0320 10:57:16.412916 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:57:16 crc kubenswrapper[4860]: E0320 10:57:16.413853 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:16 crc kubenswrapper[4860]: E0320 10:57:16.414077 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.413215 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:17 crc kubenswrapper[4860]: E0320 10:57:17.413543 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.437265 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d897530152cf8d1ddca69100f0ae29a4da57552de29a47aaed46aa70fa805e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:57:11Z\\\",\\\"message\\\":\\\"2026-03-20T10:56:25+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_e1884f66-2279-4212-9c83-f9ac441fb962\\\\n2026-03-20T10:56:25+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_e1884f66-2279-4212-9c83-f9ac441fb962 to /host/opt/cni/bin/\\\\n2026-03-20T10:56:26Z [verbose] multus-daemon started\\\\n2026-03-20T10:56:26Z [verbose] Readiness Indicator file check\\\\n2026-03-20T10:57:11Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:57:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:17Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.464893 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c4df2ec9fe14b2447ca7aa8d5d033c15f8ac4a7ce1cd4cb8430596293d9b9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e2d
106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:31Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:17Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.482734 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d35cb5-8c91-463a-a966-d34faa7a97c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9696d34484c79734fbd6e1a40f5e4ce6a680b0a67c52cb58ff7a1a1feb8390ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229f12dfeed51b7ac52ddcc0137be0fa53ddc12d3969ca2c10cdf6d6ac80932d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://229f12dfeed51b7ac52ddcc0137be0fa53ddc12d3969ca2c10cdf6d6ac80932d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:17Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.523504 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:17Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.541866 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:17Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.560127 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://299876d90de1dc71dcbd805a769b991971944194b57cda3dff90392f35318dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:17Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.578680 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ba3ecd4dbf40f800c11196492112694eb2fe160969cbd5f0b6fa7335552777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2176d4712fa4ee31d4322bb4cf9433058b2da0d6000e5bfb9e7d230fdffb3eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:17Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.591386 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q85gq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"035f0b3d-92ee-4564-8dad-28b231e1c800\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q85gq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:17Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:17 crc 
kubenswrapper[4860]: E0320 10:57:17.597946 4860 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.609770 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e92d2fc-00a4-4ad4-873d-d8d49b45c703\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9917a32ef81256fec12a0a0679be3cf3a2e1f5dab824c4c23c2cf252433a68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\
\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b968284eb846046e80483bafc625da31241ff1781ebd5216342bc2538064d124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acfeabcb583a79ac597fdc7d5ea5a7e28192a337743c9ac0d585d931e1a4406c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74e43d4563df
d8a16d0e44d9e42cbad850bf1fe0f96f4cf7d1699ea3b0beec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e74e43d4563dfd8a16d0e44d9e42cbad850bf1fe0f96f4cf7d1699ea3b0beec9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:17Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.628712 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:17Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.643558 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:17Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.656902 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tggrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2ebec10e00fab3194ba3501ab22f0d9381d9e75d28690a6ee5723da97fd226\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"n
ode-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rv72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tggrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:17Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.677291 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c14e94ed75bdc67469f86002f56ed08fc3810f5a6316401c00a66d56d405e14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5
a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:17Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.691124 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:17Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.704399 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:17Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.719983 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:17Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.740292 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b952609a27351fa4ebbd7a4f68307aee93a3505e07bb3cb86439e4d963d26cd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b952609a27351fa4ebbd7a4f68307aee93a3505e07bb3cb86439e4d963d26cd7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:56:53Z\\\",\\\"message\\\":\\\"handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 10:56:53.462992 7196 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 10:56:53.463000 7196 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 10:56:53.463008 7196 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 10:56:53.463019 7196 handler.go:208] 
Removed *v1.EgressIP event handler 8\\\\nI0320 10:56:53.463348 7196 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0320 10:56:53.463395 7196 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 10:56:53.463446 7196 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 10:56:53.463458 7196 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0320 10:56:53.463471 7196 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0320 10:56:53.463476 7196 factory.go:656] Stopping watch factory\\\\nI0320 10:56:53.463506 7196 ovnkube.go:599] Stopped ovnkube\\\\nI0320 10:56:53.463520 7196 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 10:56:53.463508 7196 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 10:56:53.463560 7196 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0320 10:56:53.463584 7196 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 10:56:53.463671 7196 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nbkmw_openshift-ovn-kubernetes(eb85f6f9-1c0f-4388-9464-25dfe48d8d0f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f
8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nbkmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:17Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.758572 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ngb2s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a19086-4679-4d42-96b8-942e00d8491f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://029dd44d42d236df04007ba436ae1800ebdf356c32c3506f621c55418b50526e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a83599b2c542987a93965e3fd026abc1eccd
07fb78dc6ad777b03821eb4ed59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ngb2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:17Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.889141 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.889215 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.889275 4860 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.889339 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.889359 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:17Z","lastTransitionTime":"2026-03-20T10:57:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:17 crc kubenswrapper[4860]: E0320 10:57:17.910717 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:17Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.915933 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.915971 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.915984 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.916002 4860 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.916014 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:17Z","lastTransitionTime":"2026-03-20T10:57:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:17 crc kubenswrapper[4860]: E0320 10:57:17.938417 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:17Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.942809 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.942837 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.942853 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.942873 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.942885 4860 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:17Z","lastTransitionTime":"2026-03-20T10:57:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:17 crc kubenswrapper[4860]: E0320 10:57:17.957655 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:17Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.963895 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.963935 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.963946 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.963967 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.963980 4860 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:17Z","lastTransitionTime":"2026-03-20T10:57:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:17 crc kubenswrapper[4860]: E0320 10:57:17.986822 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:17Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.992694 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.992750 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.992763 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.992788 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:17 crc kubenswrapper[4860]: I0320 10:57:17.992802 4860 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:17Z","lastTransitionTime":"2026-03-20T10:57:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:18 crc kubenswrapper[4860]: E0320 10:57:18.007633 4860 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21bb8ef-2c26-4952-9b24-e8f54bfb6e63\\\",\\\"systemUUID\\\":\\\"5064a76f-5382-46f7-bae1-fe91bc80db78\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:18Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:18 crc kubenswrapper[4860]: E0320 10:57:18.007802 4860 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 10:57:18 crc kubenswrapper[4860]: I0320 10:57:18.412802 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:18 crc kubenswrapper[4860]: I0320 10:57:18.413568 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:57:18 crc kubenswrapper[4860]: E0320 10:57:18.413637 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:18 crc kubenswrapper[4860]: I0320 10:57:18.413675 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:18 crc kubenswrapper[4860]: E0320 10:57:18.413785 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:57:18 crc kubenswrapper[4860]: E0320 10:57:18.413951 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:19 crc kubenswrapper[4860]: I0320 10:57:19.413735 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:19 crc kubenswrapper[4860]: E0320 10:57:19.413883 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:20 crc kubenswrapper[4860]: I0320 10:57:20.412787 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:20 crc kubenswrapper[4860]: E0320 10:57:20.413107 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:20 crc kubenswrapper[4860]: I0320 10:57:20.413166 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:20 crc kubenswrapper[4860]: I0320 10:57:20.413214 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:57:20 crc kubenswrapper[4860]: E0320 10:57:20.413403 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:20 crc kubenswrapper[4860]: E0320 10:57:20.413610 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:57:21 crc kubenswrapper[4860]: I0320 10:57:21.413371 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:21 crc kubenswrapper[4860]: E0320 10:57:21.413533 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:22 crc kubenswrapper[4860]: I0320 10:57:22.413368 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:57:22 crc kubenswrapper[4860]: I0320 10:57:22.413473 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:22 crc kubenswrapper[4860]: E0320 10:57:22.413535 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:57:22 crc kubenswrapper[4860]: I0320 10:57:22.413565 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:22 crc kubenswrapper[4860]: E0320 10:57:22.413664 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:22 crc kubenswrapper[4860]: E0320 10:57:22.413797 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:22 crc kubenswrapper[4860]: I0320 10:57:22.414953 4860 scope.go:117] "RemoveContainer" containerID="b952609a27351fa4ebbd7a4f68307aee93a3505e07bb3cb86439e4d963d26cd7" Mar 20 10:57:22 crc kubenswrapper[4860]: E0320 10:57:22.599615 4860 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 10:57:23 crc kubenswrapper[4860]: I0320 10:57:23.249029 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nbkmw_eb85f6f9-1c0f-4388-9464-25dfe48d8d0f/ovnkube-controller/2.log" Mar 20 10:57:23 crc kubenswrapper[4860]: I0320 10:57:23.253656 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" event={"ID":"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f","Type":"ContainerStarted","Data":"8b2175e8b0f3d0a1b8ef8642ce83c4b05c7ce1f29d0f3417822f71330a375049"} Mar 20 10:57:23 crc kubenswrapper[4860]: I0320 10:57:23.254214 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:57:23 crc kubenswrapper[4860]: I0320 10:57:23.273948 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ngb2s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69a19086-4679-4d42-96b8-942e00d8491f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://029dd44d42d236df04007ba436ae1800ebdf356c32c3506f621c55418b50526e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a83599b2c542987a93965e3fd026abc1eccd
07fb78dc6ad777b03821eb4ed59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rrm4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ngb2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:23 crc kubenswrapper[4860]: I0320 10:57:23.296765 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:23 crc kubenswrapper[4860]: I0320 10:57:23.311216 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:23 crc kubenswrapper[4860]: I0320 10:57:23.323765 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://299876d90de1dc71dcbd805a769b991971944194b57cda3dff90392f35318dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:23 crc kubenswrapper[4860]: I0320 10:57:23.335438 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ba3ecd4dbf40f800c11196492112694eb2fe160969cbd5f0b6fa7335552777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2176d4712fa4ee31d4322bb4cf9433058b2da0d6000e5bfb9e7d230fdffb3eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:23 crc kubenswrapper[4860]: I0320 10:57:23.348855 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d897530152cf8d1ddca69100f0ae29a4da57552de29a47aaed46aa70fa805e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:57:11Z\\\",\\\"message\\\":\\\"2026-03-20T10:56:25+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_e1884f66-2279-4212-9c83-f9ac441fb962\\\\n2026-03-20T10:56:25+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_e1884f66-2279-4212-9c83-f9ac441fb962 to /host/opt/cni/bin/\\\\n2026-03-20T10:56:26Z [verbose] multus-daemon started\\\\n2026-03-20T10:56:26Z [verbose] Readiness Indicator file check\\\\n2026-03-20T10:57:11Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:57:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountP
ath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:23 crc kubenswrapper[4860]: I0320 10:57:23.362986 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c4df2ec9fe14b2447ca7aa8d5d033c15f8ac4a7ce1cd4cb8430596293d9b9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e2d
106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:31Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:23 crc kubenswrapper[4860]: I0320 10:57:23.372869 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d35cb5-8c91-463a-a966-d34faa7a97c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9696d34484c79734fbd6e1a40f5e4ce6a680b0a67c52cb58ff7a1a1feb8390ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229f12dfeed51b7ac52ddcc0137be0fa53ddc12d3969ca2c10cdf6d6ac80932d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://229f12dfeed51b7ac52ddcc0137be0fa53ddc12d3969ca2c10cdf6d6ac80932d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:23 crc kubenswrapper[4860]: I0320 10:57:23.399207 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e92d2fc-00a4-4ad4-873d-d8d49b45c703\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9917a32ef81256fec12a0a0679be3cf3a2e1f5dab824c4c23c2cf252433a68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b968284eb846046e80483bafc625da31241ff1781ebd5216342bc2538064d124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acfeabcb583a79ac597fdc7d5ea5a7e28192a337743c9ac0d585d931e1a4406c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74e43d4563dfd8a16d0e44d9e42cbad850bf1fe0f96f4cf7d1699ea3b0beec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e74e43d4563dfd8a16d0e44d9e42cbad850bf1fe0f96f4cf7d1699ea3b0beec9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:23 crc kubenswrapper[4860]: I0320 10:57:23.412914 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:23 crc kubenswrapper[4860]: I0320 10:57:23.413028 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:23 crc kubenswrapper[4860]: E0320 10:57:23.413204 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:23 crc kubenswrapper[4860]: I0320 10:57:23.427152 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 20 10:57:23 crc kubenswrapper[4860]: I0320 10:57:23.430068 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7
f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:23 crc kubenswrapper[4860]: I0320 10:57:23.440173 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tggrc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2ebec10e00fab3194ba3501ab22f0d9381d9e75d28690a6ee5723da97fd226\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rv72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tggrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:23 crc kubenswrapper[4860]: I0320 10:57:23.454036 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q85gq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"035f0b3d-92ee-4564-8dad-28b231e1c800\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q85gq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:23 crc 
kubenswrapper[4860]: I0320 10:57:23.466428 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:23 crc kubenswrapper[4860]: I0320 10:57:23.480302 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8418cecd1d1147d2a8809df468c4c4f0b9945ccdf854566a8dca5fc4d4b822c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd0e1dd6f7812bb150627bbd3dfe0d5aba7c3403fd07fdec9367d0240c0c6f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:23 crc kubenswrapper[4860]: I0320 10:57:23.494473 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:23 crc kubenswrapper[4860]: I0320 10:57:23.512119 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b2175e8b0f3d0a1b8ef8642ce83c4b05c7ce1f29d0f3417822f71330a375049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b952609a27351fa4ebbd7a4f68307aee93a3505e07bb3cb86439e4d963d26cd7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:56:53Z\\\",\\\"message\\\":\\\"handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 10:56:53.462992 7196 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 10:56:53.463000 7196 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 10:56:53.463008 7196 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 10:56:53.463019 7196 handler.go:208] 
Removed *v1.EgressIP event handler 8\\\\nI0320 10:56:53.463348 7196 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0320 10:56:53.463395 7196 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 10:56:53.463446 7196 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 10:56:53.463458 7196 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0320 10:56:53.463471 7196 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0320 10:56:53.463476 7196 factory.go:656] Stopping watch factory\\\\nI0320 10:56:53.463506 7196 ovnkube.go:599] Stopped ovnkube\\\\nI0320 10:56:53.463520 7196 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 10:56:53.463508 7196 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 10:56:53.463560 7196 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0320 10:56:53.463584 7196 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 10:56:53.463671 7196 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:57:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\
\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9btp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nbkmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:23 crc kubenswrapper[4860]: I0320 10:57:23.526915 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c14e94ed75bdc67469f86002f56ed08fc3810f5a6316401c00a66d56d405e14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5
a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:24 crc kubenswrapper[4860]: I0320 10:57:24.259777 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nbkmw_eb85f6f9-1c0f-4388-9464-25dfe48d8d0f/ovnkube-controller/3.log" Mar 20 10:57:24 crc kubenswrapper[4860]: I0320 10:57:24.260768 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nbkmw_eb85f6f9-1c0f-4388-9464-25dfe48d8d0f/ovnkube-controller/2.log" Mar 20 10:57:24 crc kubenswrapper[4860]: I0320 10:57:24.264330 4860 generic.go:334] "Generic (PLEG): container finished" podID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerID="8b2175e8b0f3d0a1b8ef8642ce83c4b05c7ce1f29d0f3417822f71330a375049" exitCode=1 Mar 20 10:57:24 crc kubenswrapper[4860]: I0320 10:57:24.264407 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" event={"ID":"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f","Type":"ContainerDied","Data":"8b2175e8b0f3d0a1b8ef8642ce83c4b05c7ce1f29d0f3417822f71330a375049"} Mar 20 10:57:24 crc kubenswrapper[4860]: I0320 10:57:24.264466 4860 scope.go:117] "RemoveContainer" 
containerID="b952609a27351fa4ebbd7a4f68307aee93a3505e07bb3cb86439e4d963d26cd7" Mar 20 10:57:24 crc kubenswrapper[4860]: I0320 10:57:24.265159 4860 scope.go:117] "RemoveContainer" containerID="8b2175e8b0f3d0a1b8ef8642ce83c4b05c7ce1f29d0f3417822f71330a375049" Mar 20 10:57:24 crc kubenswrapper[4860]: E0320 10:57:24.265609 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-nbkmw_openshift-ovn-kubernetes(eb85f6f9-1c0f-4388-9464-25dfe48d8d0f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" Mar 20 10:57:24 crc kubenswrapper[4860]: I0320 10:57:24.283915 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d35cb5-8c91-463a-a966-d34faa7a97c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9696d34484c79734fbd6e1a40f5e4ce6a680b0a67c52cb58ff7a1a1feb8390ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42
745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229f12dfeed51b7ac52ddcc0137be0fa53ddc12d3969ca2c10cdf6d6ac80932d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://229f12dfeed51b7ac52ddcc0137be0fa53ddc12d3969ca2c10cdf6d6ac80932d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": 
tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:24 crc kubenswrapper[4860]: I0320 10:57:24.310040 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12fec5ee-a702-4f8f-93d6-9e7e36e42c76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18b7693932893caf09d1c42ddebf41401f321bbe619c7f163efb0da2f47f2444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-d
ir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b067a680ec99f6f7e89579c3da7b34c98f009b6b63541fd6707b5e27d0aafc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://277354cde9e0788801a3f735b4f2a233eeb78b8b930da73eb08befac3ff71c78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b10fbf2898d7d2de70d97c8e52829aad99b9085918cdd461a117395fad92aa6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa37232
69019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46c1b956c902a5d92870e97c2e34a8a31b0501cc402d19a877550391ebc3934\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a
5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3a947a000a6e920853f20e62105554aaf994c6ef52dc20ec13ab180c6739a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://023b27b28feb92638d0df225125af8f3b318c9f2088cd345a58f2a7aa4bb9e69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043de037d6efd6c01007d995becebc0093bc5e21dfec3183f0160299bc7cf691\\\",\\\"exitC
ode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:24 crc kubenswrapper[4860]: I0320 10:57:24.329468 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:24 crc kubenswrapper[4860]: I0320 10:57:24.342890 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-srbpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e597b5-a377-4988-8c59-eeace5ffa4e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://299876d90de1dc71dcbd805a769b991971944194b57cda3dff90392f35318dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml6nd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-srbpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:24 crc kubenswrapper[4860]: I0320 10:57:24.357336 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9df230-75a1-4b64-8d00-c179e9c19080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9ba3ecd4dbf40f800c11196492112694eb2fe160969cbd5f0b6fa7335552777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2176d4712fa4ee31d4322bb4cf9433058b2da0d6000e5bfb9e7d230fdffb3eda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vhhqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:24 crc kubenswrapper[4860]: I0320 10:57:24.374755 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmc44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c8af2-338f-401f-aad5-c6d7763a3b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d897530152cf8d1ddca69100f0ae29a4da57552de29a47aaed46aa70fa805e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:57:11Z\\\",\\\"message\\\":\\\"2026-03-20T10:56:25+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_e1884f66-2279-4212-9c83-f9ac441fb962\\\\n2026-03-20T10:56:25+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_e1884f66-2279-4212-9c83-f9ac441fb962 to /host/opt/cni/bin/\\\\n2026-03-20T10:56:26Z [verbose] multus-daemon started\\\\n2026-03-20T10:56:26Z [verbose] Readiness Indicator file check\\\\n2026-03-20T10:57:11Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:57:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountP
ath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w22pp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmc44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:24 crc kubenswrapper[4860]: I0320 10:57:24.399982 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"329ab518-a391-4483-8373-1329318b58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c4df2ec9fe14b2447ca7aa8d5d033c15f8ac4a7ce1cd4cb8430596293d9b9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff36e6e6b51f67caa8acd6b115968ea600d6618906a0172bda918ce7c5fade6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b564cf8b3d26fe5a6126f072c04b6e6e32bdc1c297669206f6febdc3280faa3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef2c4b271764c5d2b41838248dee886ffc52676dc0a333f8ba81831f065b4e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79e2d
106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79e2d106005d9ac135bb8fb0206a8a1e9cafa686df7fa77a3ffb7e850f337e52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44b3cb30bc75694b811b046288cb1b6498390ed237ea62e1a2a9ba8a17718ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:31Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca820052dd43e300943f400379bdaecd0db1d7a2ef5bf89b3f89bf24bd85109\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-242fz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wpj5w\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:24 crc kubenswrapper[4860]: I0320 10:57:24.413115 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:24 crc kubenswrapper[4860]: I0320 10:57:24.413138 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:57:24 crc kubenswrapper[4860]: E0320 10:57:24.413299 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:24 crc kubenswrapper[4860]: I0320 10:57:24.413442 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:24 crc kubenswrapper[4860]: E0320 10:57:24.413599 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:24 crc kubenswrapper[4860]: E0320 10:57:24.413949 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:57:24 crc kubenswrapper[4860]: I0320 10:57:24.432418 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8e33372-ff64-42f9-a28d-a1a292559759\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf2624c09c9ed0c88340cb5c33a9f304b84e7f10b768178bc03980768edd770b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://82f4a0bcf048582263f9ba78f91872aba9de0ba2e3cce65a31e587c04a849fbc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:39Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 10:55:10.245671 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0320 10:55:10.247467 1 observer_polling.go:159] Starting file observer\\\\nI0320 10:55:10.249168 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 10:55:10.250405 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 10:55:39.912190 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 10:55:39.912319 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:10Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bf4a38879e8e3c687bd3c57a3c68a29ad9a9e609ea0cbd220493b6ee4e7d9a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/
kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4edc7f588e4fdfa1c92fcf94e685925ef7708d48f6dc4a72363331f66f0b4ab7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b411b2d78ae0ca6e465eafe2ca565d78630979ffc93ff9fb0785c70d42e4c447\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\
"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:24 crc kubenswrapper[4860]: I0320 10:57:24.448817 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e92d2fc-00a4-4ad4-873d-d8d49b45c703\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9917a32ef81256fec12a0a0679be3cf3a2e1f5dab824c4c23c2cf252433a68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-sche
duler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b968284eb846046e80483bafc625da31241ff1781ebd5216342bc2538064d124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acfeabcb583a79ac597fdc7d5ea5a7e28192a337743c9ac0d585d931e1a4406c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]
}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74e43d4563dfd8a16d0e44d9e42cbad850bf1fe0f96f4cf7d1699ea3b0beec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e74e43d4563dfd8a16d0e44d9e42cbad850bf1fe0f96f4cf7d1699ea3b0beec9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:24 crc kubenswrapper[4860]: I0320 10:57:24.465554 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0072e277f5b3c9ee56fe99cdbedaeb8cc32dc972910fd660f64488e7c3b07c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:24 crc kubenswrapper[4860]: I0320 10:57:24.477115 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f54b30bd98921f791baccc4be68fa945ba2d9ca9bd415e32359d70d15d80410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:24 crc kubenswrapper[4860]: I0320 10:57:24.492191 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tggrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d9c3a3-4ed8-43ec-bb4a-fc1d49784105\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b2ebec10e00fab3194ba3501ab22f0d9381d9e75d28690a6ee5723da97fd226\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"n
ode-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rv72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tggrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:24 crc kubenswrapper[4860]: I0320 10:57:24.513506 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-q85gq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"035f0b3d-92ee-4564-8dad-28b231e1c800\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlrn2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-q85gq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:24 crc 
kubenswrapper[4860]: I0320 10:57:24.527803 4860 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26618d38-6c86-4f4d-84d0-33bd5a64ca4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59ed071ee7bcd2
5da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c14e94ed75bdc67469f86002f56ed08fc3810f5a6316401c00a66d56d405e14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:55:50.057036 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:55:50.057194 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:55:50.058157 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3809542762/tls.crt::/tmp/serving-cert-3809542762/tls.key\\\\\\\"\\\\nI0320 10:55:50.660894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:55:50.663707 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:55:50.663726 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:55:50.663750 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:55:50.663756 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:55:50.668753 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:55:50.669001 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669008 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:55:50.669014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:55:50.669019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:55:50.669023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0320 
10:55:50.668830 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 10:55:50.669070 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 10:55:50.669835 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:54:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:54:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:25 crc kubenswrapper[4860]: I0320 10:57:25.271309 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nbkmw_eb85f6f9-1c0f-4388-9464-25dfe48d8d0f/ovnkube-controller/3.log" Mar 20 10:57:25 crc kubenswrapper[4860]: I0320 10:57:25.277811 4860 scope.go:117] "RemoveContainer" containerID="8b2175e8b0f3d0a1b8ef8642ce83c4b05c7ce1f29d0f3417822f71330a375049" Mar 20 10:57:25 crc kubenswrapper[4860]: E0320 10:57:25.278043 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-nbkmw_openshift-ovn-kubernetes(eb85f6f9-1c0f-4388-9464-25dfe48d8d0f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" Mar 20 10:57:25 crc kubenswrapper[4860]: I0320 10:57:25.299183 4860 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ngb2s" podStartSLOduration=96.299142452 podStartE2EDuration="1m36.299142452s" podCreationTimestamp="2026-03-20 10:55:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:24.629626306 +0000 UTC m=+168.850987204" watchObservedRunningTime="2026-03-20 10:57:25.299142452 +0000 UTC m=+169.520503390" Mar 20 10:57:25 crc kubenswrapper[4860]: I0320 10:57:25.322537 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-srbpg" podStartSLOduration=97.322498143 podStartE2EDuration="1m37.322498143s" podCreationTimestamp="2026-03-20 10:55:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:25.321686401 +0000 UTC m=+169.543047379" watchObservedRunningTime="2026-03-20 10:57:25.322498143 +0000 UTC m=+169.543859081" Mar 20 10:57:25 crc kubenswrapper[4860]: I0320 10:57:25.337514 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podStartSLOduration=97.337476875 podStartE2EDuration="1m37.337476875s" podCreationTimestamp="2026-03-20 10:55:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:25.337206197 +0000 UTC m=+169.558567135" watchObservedRunningTime="2026-03-20 10:57:25.337476875 +0000 UTC m=+169.558837783" Mar 20 10:57:25 crc kubenswrapper[4860]: I0320 10:57:25.356294 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-cmc44" podStartSLOduration=97.356268681 podStartE2EDuration="1m37.356268681s" podCreationTimestamp="2026-03-20 10:55:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:25.356075715 +0000 UTC m=+169.577436653" watchObservedRunningTime="2026-03-20 10:57:25.356268681 +0000 UTC m=+169.577629579" Mar 20 10:57:25 crc kubenswrapper[4860]: I0320 10:57:25.391406 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-wpj5w" podStartSLOduration=97.391376225 podStartE2EDuration="1m37.391376225s" podCreationTimestamp="2026-03-20 10:55:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:25.391104177 +0000 UTC m=+169.612465115" watchObservedRunningTime="2026-03-20 10:57:25.391376225 +0000 UTC m=+169.612737143" Mar 20 10:57:25 crc kubenswrapper[4860]: I0320 10:57:25.409666 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=10.409642496 podStartE2EDuration="10.409642496s" podCreationTimestamp="2026-03-20 10:57:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:25.408917997 +0000 UTC m=+169.630278915" watchObservedRunningTime="2026-03-20 10:57:25.409642496 +0000 UTC m=+169.631003404" Mar 20 10:57:25 crc kubenswrapper[4860]: I0320 10:57:25.413384 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:25 crc kubenswrapper[4860]: E0320 10:57:25.413582 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:25 crc kubenswrapper[4860]: I0320 10:57:25.448957 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=74.448929425 podStartE2EDuration="1m14.448929425s" podCreationTimestamp="2026-03-20 10:56:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:25.448490983 +0000 UTC m=+169.669851901" watchObservedRunningTime="2026-03-20 10:57:25.448929425 +0000 UTC m=+169.670290333" Mar 20 10:57:25 crc kubenswrapper[4860]: I0320 10:57:25.498594 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-tggrc" podStartSLOduration=97.498567698 podStartE2EDuration="1m37.498567698s" podCreationTimestamp="2026-03-20 10:55:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:25.486856177 +0000 UTC m=+169.708217065" watchObservedRunningTime="2026-03-20 10:57:25.498567698 +0000 UTC m=+169.719928596" Mar 20 10:57:25 crc kubenswrapper[4860]: I0320 10:57:25.527930 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=41.527887634 podStartE2EDuration="41.527887634s" podCreationTimestamp="2026-03-20 10:56:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:25.526991879 +0000 UTC m=+169.748352787" watchObservedRunningTime="2026-03-20 10:57:25.527887634 +0000 UTC m=+169.749248532" Mar 20 10:57:25 crc kubenswrapper[4860]: I0320 10:57:25.528264 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=2.528255954 podStartE2EDuration="2.528255954s" podCreationTimestamp="2026-03-20 10:57:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:25.512266525 +0000 UTC m=+169.733627423" watchObservedRunningTime="2026-03-20 10:57:25.528255954 +0000 UTC m=+169.749616852" Mar 20 10:57:26 crc kubenswrapper[4860]: I0320 10:57:26.412967 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:26 crc kubenswrapper[4860]: I0320 10:57:26.413119 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:26 crc kubenswrapper[4860]: I0320 10:57:26.413004 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:57:26 crc kubenswrapper[4860]: E0320 10:57:26.413294 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:26 crc kubenswrapper[4860]: E0320 10:57:26.413361 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:26 crc kubenswrapper[4860]: E0320 10:57:26.413509 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:57:27 crc kubenswrapper[4860]: I0320 10:57:27.413021 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:27 crc kubenswrapper[4860]: E0320 10:57:27.414836 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:27 crc kubenswrapper[4860]: E0320 10:57:27.600319 4860 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 10:57:28 crc kubenswrapper[4860]: I0320 10:57:28.036727 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:28 crc kubenswrapper[4860]: I0320 10:57:28.036787 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:28 crc kubenswrapper[4860]: I0320 10:57:28.036805 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:28 crc kubenswrapper[4860]: I0320 10:57:28.036829 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:28 crc kubenswrapper[4860]: I0320 10:57:28.036848 4860 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:28Z","lastTransitionTime":"2026-03-20T10:57:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:57:28 crc kubenswrapper[4860]: I0320 10:57:28.096185 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=81.096144783 podStartE2EDuration="1m21.096144783s" podCreationTimestamp="2026-03-20 10:56:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:25.624336752 +0000 UTC m=+169.845697660" watchObservedRunningTime="2026-03-20 10:57:28.096144783 +0000 UTC m=+172.317505701" Mar 20 10:57:28 crc kubenswrapper[4860]: I0320 10:57:28.098785 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-sqdf8"] Mar 20 10:57:28 crc kubenswrapper[4860]: I0320 10:57:28.099203 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sqdf8" Mar 20 10:57:28 crc kubenswrapper[4860]: I0320 10:57:28.101914 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 20 10:57:28 crc kubenswrapper[4860]: I0320 10:57:28.101996 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 20 10:57:28 crc kubenswrapper[4860]: I0320 10:57:28.101923 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 20 10:57:28 crc kubenswrapper[4860]: I0320 10:57:28.102463 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 20 10:57:28 crc kubenswrapper[4860]: I0320 10:57:28.178524 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/dc11373f-096b-4cc4-810b-f702f819da6c-service-ca\") pod \"cluster-version-operator-5c965bbfc6-sqdf8\" (UID: \"dc11373f-096b-4cc4-810b-f702f819da6c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sqdf8" Mar 20 10:57:28 crc kubenswrapper[4860]: I0320 10:57:28.178637 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dc11373f-096b-4cc4-810b-f702f819da6c-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-sqdf8\" (UID: \"dc11373f-096b-4cc4-810b-f702f819da6c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sqdf8" Mar 20 10:57:28 crc kubenswrapper[4860]: I0320 10:57:28.178750 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/dc11373f-096b-4cc4-810b-f702f819da6c-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-sqdf8\" (UID: \"dc11373f-096b-4cc4-810b-f702f819da6c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sqdf8" Mar 20 10:57:28 crc kubenswrapper[4860]: I0320 10:57:28.178807 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/dc11373f-096b-4cc4-810b-f702f819da6c-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-sqdf8\" (UID: \"dc11373f-096b-4cc4-810b-f702f819da6c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sqdf8" Mar 20 10:57:28 crc kubenswrapper[4860]: I0320 10:57:28.178861 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dc11373f-096b-4cc4-810b-f702f819da6c-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-sqdf8\" (UID: \"dc11373f-096b-4cc4-810b-f702f819da6c\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sqdf8" Mar 20 10:57:28 crc kubenswrapper[4860]: I0320 10:57:28.280546 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dc11373f-096b-4cc4-810b-f702f819da6c-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-sqdf8\" (UID: \"dc11373f-096b-4cc4-810b-f702f819da6c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sqdf8" Mar 20 10:57:28 crc kubenswrapper[4860]: I0320 10:57:28.281103 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dc11373f-096b-4cc4-810b-f702f819da6c-service-ca\") pod \"cluster-version-operator-5c965bbfc6-sqdf8\" (UID: \"dc11373f-096b-4cc4-810b-f702f819da6c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sqdf8" Mar 20 10:57:28 crc kubenswrapper[4860]: I0320 10:57:28.281456 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/dc11373f-096b-4cc4-810b-f702f819da6c-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-sqdf8\" (UID: \"dc11373f-096b-4cc4-810b-f702f819da6c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sqdf8" Mar 20 10:57:28 crc kubenswrapper[4860]: I0320 10:57:28.281726 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dc11373f-096b-4cc4-810b-f702f819da6c-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-sqdf8\" (UID: \"dc11373f-096b-4cc4-810b-f702f819da6c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sqdf8" Mar 20 10:57:28 crc kubenswrapper[4860]: I0320 10:57:28.282004 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/dc11373f-096b-4cc4-810b-f702f819da6c-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-sqdf8\" (UID: \"dc11373f-096b-4cc4-810b-f702f819da6c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sqdf8" Mar 20 10:57:28 crc kubenswrapper[4860]: I0320 10:57:28.281620 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/dc11373f-096b-4cc4-810b-f702f819da6c-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-sqdf8\" (UID: \"dc11373f-096b-4cc4-810b-f702f819da6c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sqdf8" Mar 20 10:57:28 crc kubenswrapper[4860]: I0320 10:57:28.282137 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/dc11373f-096b-4cc4-810b-f702f819da6c-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-sqdf8\" (UID: \"dc11373f-096b-4cc4-810b-f702f819da6c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sqdf8" Mar 20 10:57:28 crc kubenswrapper[4860]: I0320 10:57:28.283182 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dc11373f-096b-4cc4-810b-f702f819da6c-service-ca\") pod \"cluster-version-operator-5c965bbfc6-sqdf8\" (UID: \"dc11373f-096b-4cc4-810b-f702f819da6c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sqdf8" Mar 20 10:57:28 crc kubenswrapper[4860]: I0320 10:57:28.293018 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dc11373f-096b-4cc4-810b-f702f819da6c-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-sqdf8\" (UID: \"dc11373f-096b-4cc4-810b-f702f819da6c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sqdf8" Mar 20 10:57:28 crc kubenswrapper[4860]: I0320 
10:57:28.305795 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dc11373f-096b-4cc4-810b-f702f819da6c-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-sqdf8\" (UID: \"dc11373f-096b-4cc4-810b-f702f819da6c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sqdf8" Mar 20 10:57:28 crc kubenswrapper[4860]: I0320 10:57:28.399784 4860 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 20 10:57:28 crc kubenswrapper[4860]: I0320 10:57:28.411964 4860 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 20 10:57:28 crc kubenswrapper[4860]: I0320 10:57:28.412896 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:57:28 crc kubenswrapper[4860]: I0320 10:57:28.413040 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:28 crc kubenswrapper[4860]: I0320 10:57:28.413086 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:28 crc kubenswrapper[4860]: E0320 10:57:28.413415 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:57:28 crc kubenswrapper[4860]: E0320 10:57:28.414022 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:28 crc kubenswrapper[4860]: E0320 10:57:28.414267 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:28 crc kubenswrapper[4860]: I0320 10:57:28.420123 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sqdf8" Mar 20 10:57:28 crc kubenswrapper[4860]: W0320 10:57:28.437376 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc11373f_096b_4cc4_810b_f702f819da6c.slice/crio-02fa7a35949f9d298d7e11b46e6107bd7cacd4969f37b9b9b0d7922a0bee4e27 WatchSource:0}: Error finding container 02fa7a35949f9d298d7e11b46e6107bd7cacd4969f37b9b9b0d7922a0bee4e27: Status 404 returned error can't find the container with id 02fa7a35949f9d298d7e11b46e6107bd7cacd4969f37b9b9b0d7922a0bee4e27 Mar 20 10:57:29 crc kubenswrapper[4860]: I0320 10:57:29.293736 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sqdf8" event={"ID":"dc11373f-096b-4cc4-810b-f702f819da6c","Type":"ContainerStarted","Data":"d635ee52acf225fb2d2f9567d5edbb00d2b2029902ed3a38b2251701a9c0be82"} Mar 20 10:57:29 crc kubenswrapper[4860]: I0320 10:57:29.293808 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sqdf8" event={"ID":"dc11373f-096b-4cc4-810b-f702f819da6c","Type":"ContainerStarted","Data":"02fa7a35949f9d298d7e11b46e6107bd7cacd4969f37b9b9b0d7922a0bee4e27"} Mar 20 10:57:29 crc kubenswrapper[4860]: I0320 10:57:29.315408 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sqdf8" podStartSLOduration=101.315385745 podStartE2EDuration="1m41.315385745s" podCreationTimestamp="2026-03-20 10:55:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:29.314613054 +0000 UTC m=+173.535973972" watchObservedRunningTime="2026-03-20 10:57:29.315385745 +0000 UTC m=+173.536746653" Mar 20 10:57:29 crc kubenswrapper[4860]: I0320 10:57:29.413311 4860 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:29 crc kubenswrapper[4860]: E0320 10:57:29.413476 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:30 crc kubenswrapper[4860]: I0320 10:57:30.413140 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:30 crc kubenswrapper[4860]: I0320 10:57:30.413307 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:57:30 crc kubenswrapper[4860]: I0320 10:57:30.413381 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:30 crc kubenswrapper[4860]: E0320 10:57:30.413327 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:30 crc kubenswrapper[4860]: E0320 10:57:30.413515 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:57:30 crc kubenswrapper[4860]: E0320 10:57:30.413708 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:31 crc kubenswrapper[4860]: I0320 10:57:31.413335 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:31 crc kubenswrapper[4860]: E0320 10:57:31.414098 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:32 crc kubenswrapper[4860]: I0320 10:57:32.412417 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:32 crc kubenswrapper[4860]: I0320 10:57:32.412417 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:32 crc kubenswrapper[4860]: I0320 10:57:32.412447 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:57:32 crc kubenswrapper[4860]: E0320 10:57:32.413007 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:32 crc kubenswrapper[4860]: E0320 10:57:32.413354 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:32 crc kubenswrapper[4860]: E0320 10:57:32.413773 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:57:32 crc kubenswrapper[4860]: E0320 10:57:32.601604 4860 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 10:57:33 crc kubenswrapper[4860]: I0320 10:57:33.412689 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:33 crc kubenswrapper[4860]: E0320 10:57:33.413196 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:34 crc kubenswrapper[4860]: I0320 10:57:34.414524 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:57:34 crc kubenswrapper[4860]: I0320 10:57:34.414565 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:34 crc kubenswrapper[4860]: I0320 10:57:34.414524 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:34 crc kubenswrapper[4860]: E0320 10:57:34.414715 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:57:34 crc kubenswrapper[4860]: E0320 10:57:34.414806 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:34 crc kubenswrapper[4860]: E0320 10:57:34.414880 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:35 crc kubenswrapper[4860]: I0320 10:57:35.413200 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:35 crc kubenswrapper[4860]: E0320 10:57:35.413460 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:36 crc kubenswrapper[4860]: I0320 10:57:36.412661 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:57:36 crc kubenswrapper[4860]: I0320 10:57:36.412780 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:36 crc kubenswrapper[4860]: E0320 10:57:36.412813 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:57:36 crc kubenswrapper[4860]: I0320 10:57:36.412916 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:36 crc kubenswrapper[4860]: E0320 10:57:36.412969 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:36 crc kubenswrapper[4860]: E0320 10:57:36.413126 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:37 crc kubenswrapper[4860]: I0320 10:57:37.413172 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:37 crc kubenswrapper[4860]: E0320 10:57:37.413977 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:37 crc kubenswrapper[4860]: E0320 10:57:37.602308 4860 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 10:57:38 crc kubenswrapper[4860]: I0320 10:57:38.413297 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:57:38 crc kubenswrapper[4860]: I0320 10:57:38.413401 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:38 crc kubenswrapper[4860]: I0320 10:57:38.413335 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:38 crc kubenswrapper[4860]: E0320 10:57:38.413543 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:57:38 crc kubenswrapper[4860]: E0320 10:57:38.413674 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:38 crc kubenswrapper[4860]: E0320 10:57:38.413812 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:39 crc kubenswrapper[4860]: I0320 10:57:39.109723 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/035f0b3d-92ee-4564-8dad-28b231e1c800-metrics-certs\") pod \"network-metrics-daemon-q85gq\" (UID: \"035f0b3d-92ee-4564-8dad-28b231e1c800\") " pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:57:39 crc kubenswrapper[4860]: E0320 10:57:39.110016 4860 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 10:57:39 crc kubenswrapper[4860]: E0320 10:57:39.110155 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/035f0b3d-92ee-4564-8dad-28b231e1c800-metrics-certs podName:035f0b3d-92ee-4564-8dad-28b231e1c800 nodeName:}" failed. No retries permitted until 2026-03-20 10:58:43.110122105 +0000 UTC m=+247.331483013 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/035f0b3d-92ee-4564-8dad-28b231e1c800-metrics-certs") pod "network-metrics-daemon-q85gq" (UID: "035f0b3d-92ee-4564-8dad-28b231e1c800") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 10:57:39 crc kubenswrapper[4860]: I0320 10:57:39.413047 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:39 crc kubenswrapper[4860]: E0320 10:57:39.413367 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:40 crc kubenswrapper[4860]: I0320 10:57:40.412745 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:57:40 crc kubenswrapper[4860]: I0320 10:57:40.412823 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:40 crc kubenswrapper[4860]: I0320 10:57:40.412901 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:40 crc kubenswrapper[4860]: E0320 10:57:40.412896 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:57:40 crc kubenswrapper[4860]: E0320 10:57:40.412955 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:40 crc kubenswrapper[4860]: E0320 10:57:40.413353 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:40 crc kubenswrapper[4860]: I0320 10:57:40.413727 4860 scope.go:117] "RemoveContainer" containerID="8b2175e8b0f3d0a1b8ef8642ce83c4b05c7ce1f29d0f3417822f71330a375049" Mar 20 10:57:40 crc kubenswrapper[4860]: E0320 10:57:40.414078 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-nbkmw_openshift-ovn-kubernetes(eb85f6f9-1c0f-4388-9464-25dfe48d8d0f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" Mar 20 10:57:41 crc kubenswrapper[4860]: I0320 10:57:41.412980 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:41 crc kubenswrapper[4860]: E0320 10:57:41.413378 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:42 crc kubenswrapper[4860]: I0320 10:57:42.412449 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:42 crc kubenswrapper[4860]: I0320 10:57:42.412449 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:42 crc kubenswrapper[4860]: I0320 10:57:42.412580 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:57:42 crc kubenswrapper[4860]: E0320 10:57:42.412705 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:42 crc kubenswrapper[4860]: E0320 10:57:42.412886 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:42 crc kubenswrapper[4860]: E0320 10:57:42.412914 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:57:42 crc kubenswrapper[4860]: E0320 10:57:42.603547 4860 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 10:57:43 crc kubenswrapper[4860]: I0320 10:57:43.412469 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:43 crc kubenswrapper[4860]: E0320 10:57:43.413002 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:44 crc kubenswrapper[4860]: I0320 10:57:44.412479 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:57:44 crc kubenswrapper[4860]: I0320 10:57:44.412618 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:44 crc kubenswrapper[4860]: E0320 10:57:44.412755 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:57:44 crc kubenswrapper[4860]: E0320 10:57:44.413008 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:44 crc kubenswrapper[4860]: I0320 10:57:44.413197 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:44 crc kubenswrapper[4860]: E0320 10:57:44.413362 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:45 crc kubenswrapper[4860]: I0320 10:57:45.412904 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:45 crc kubenswrapper[4860]: E0320 10:57:45.413150 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:46 crc kubenswrapper[4860]: I0320 10:57:46.413189 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:57:46 crc kubenswrapper[4860]: I0320 10:57:46.413189 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:46 crc kubenswrapper[4860]: E0320 10:57:46.413460 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:57:46 crc kubenswrapper[4860]: I0320 10:57:46.413249 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:46 crc kubenswrapper[4860]: E0320 10:57:46.413624 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:46 crc kubenswrapper[4860]: E0320 10:57:46.413782 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:47 crc kubenswrapper[4860]: I0320 10:57:47.412725 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:47 crc kubenswrapper[4860]: E0320 10:57:47.413925 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:47 crc kubenswrapper[4860]: E0320 10:57:47.604248 4860 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 10:57:48 crc kubenswrapper[4860]: I0320 10:57:48.413362 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:48 crc kubenswrapper[4860]: I0320 10:57:48.413362 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:48 crc kubenswrapper[4860]: E0320 10:57:48.413613 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:48 crc kubenswrapper[4860]: I0320 10:57:48.413395 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:57:48 crc kubenswrapper[4860]: E0320 10:57:48.413872 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:48 crc kubenswrapper[4860]: E0320 10:57:48.414092 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:57:49 crc kubenswrapper[4860]: I0320 10:57:49.412863 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:49 crc kubenswrapper[4860]: E0320 10:57:49.413447 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:50 crc kubenswrapper[4860]: I0320 10:57:50.413054 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:50 crc kubenswrapper[4860]: I0320 10:57:50.413141 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:50 crc kubenswrapper[4860]: I0320 10:57:50.413177 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:57:50 crc kubenswrapper[4860]: E0320 10:57:50.413244 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:50 crc kubenswrapper[4860]: E0320 10:57:50.413400 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:50 crc kubenswrapper[4860]: E0320 10:57:50.413514 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:57:51 crc kubenswrapper[4860]: I0320 10:57:51.412692 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:51 crc kubenswrapper[4860]: E0320 10:57:51.413081 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:52 crc kubenswrapper[4860]: I0320 10:57:52.413131 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:52 crc kubenswrapper[4860]: I0320 10:57:52.413175 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:57:52 crc kubenswrapper[4860]: I0320 10:57:52.413270 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:52 crc kubenswrapper[4860]: E0320 10:57:52.413428 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:52 crc kubenswrapper[4860]: E0320 10:57:52.413694 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:57:52 crc kubenswrapper[4860]: E0320 10:57:52.413876 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:52 crc kubenswrapper[4860]: E0320 10:57:52.606078 4860 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 10:57:53 crc kubenswrapper[4860]: I0320 10:57:53.413512 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:53 crc kubenswrapper[4860]: E0320 10:57:53.414127 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:53 crc kubenswrapper[4860]: I0320 10:57:53.414543 4860 scope.go:117] "RemoveContainer" containerID="8b2175e8b0f3d0a1b8ef8642ce83c4b05c7ce1f29d0f3417822f71330a375049" Mar 20 10:57:53 crc kubenswrapper[4860]: E0320 10:57:53.414763 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-nbkmw_openshift-ovn-kubernetes(eb85f6f9-1c0f-4388-9464-25dfe48d8d0f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" Mar 20 10:57:54 crc kubenswrapper[4860]: I0320 10:57:54.412382 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:54 crc kubenswrapper[4860]: I0320 10:57:54.412489 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:57:54 crc kubenswrapper[4860]: I0320 10:57:54.412410 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:54 crc kubenswrapper[4860]: E0320 10:57:54.412680 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:54 crc kubenswrapper[4860]: E0320 10:57:54.413071 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:57:54 crc kubenswrapper[4860]: E0320 10:57:54.413210 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:55 crc kubenswrapper[4860]: I0320 10:57:55.412962 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:55 crc kubenswrapper[4860]: E0320 10:57:55.413176 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:56 crc kubenswrapper[4860]: I0320 10:57:56.412622 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:56 crc kubenswrapper[4860]: I0320 10:57:56.412762 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:56 crc kubenswrapper[4860]: I0320 10:57:56.412615 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:57:56 crc kubenswrapper[4860]: E0320 10:57:56.412919 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:56 crc kubenswrapper[4860]: E0320 10:57:56.413160 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:56 crc kubenswrapper[4860]: E0320 10:57:56.413389 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:57:57 crc kubenswrapper[4860]: I0320 10:57:57.413017 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:57 crc kubenswrapper[4860]: E0320 10:57:57.415273 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:57 crc kubenswrapper[4860]: E0320 10:57:57.606968 4860 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 10:57:58 crc kubenswrapper[4860]: I0320 10:57:58.413024 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:58 crc kubenswrapper[4860]: I0320 10:57:58.413024 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:57:58 crc kubenswrapper[4860]: I0320 10:57:58.413119 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:58 crc kubenswrapper[4860]: E0320 10:57:58.414206 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:57:58 crc kubenswrapper[4860]: E0320 10:57:58.414343 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:58 crc kubenswrapper[4860]: E0320 10:57:58.414613 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:58 crc kubenswrapper[4860]: I0320 10:57:58.415618 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cmc44_a89c8af2-338f-401f-aad5-c6d7763a3b3a/kube-multus/1.log" Mar 20 10:57:58 crc kubenswrapper[4860]: I0320 10:57:58.416558 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cmc44_a89c8af2-338f-401f-aad5-c6d7763a3b3a/kube-multus/0.log" Mar 20 10:57:58 crc kubenswrapper[4860]: I0320 10:57:58.416635 4860 generic.go:334] "Generic (PLEG): container finished" podID="a89c8af2-338f-401f-aad5-c6d7763a3b3a" containerID="e1d897530152cf8d1ddca69100f0ae29a4da57552de29a47aaed46aa70fa805e" exitCode=1 Mar 20 10:57:58 crc kubenswrapper[4860]: I0320 10:57:58.416682 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cmc44" event={"ID":"a89c8af2-338f-401f-aad5-c6d7763a3b3a","Type":"ContainerDied","Data":"e1d897530152cf8d1ddca69100f0ae29a4da57552de29a47aaed46aa70fa805e"} Mar 20 10:57:58 crc kubenswrapper[4860]: I0320 10:57:58.416734 4860 scope.go:117] "RemoveContainer" containerID="b5709387395dd6da1b208c3254efbf5fe9e0f9927f1f97c3c72636b6525cd311" Mar 20 10:57:58 crc kubenswrapper[4860]: I0320 10:57:58.417435 4860 scope.go:117] "RemoveContainer" containerID="e1d897530152cf8d1ddca69100f0ae29a4da57552de29a47aaed46aa70fa805e" Mar 20 10:57:58 crc kubenswrapper[4860]: E0320 10:57:58.417803 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-cmc44_openshift-multus(a89c8af2-338f-401f-aad5-c6d7763a3b3a)\"" pod="openshift-multus/multus-cmc44" podUID="a89c8af2-338f-401f-aad5-c6d7763a3b3a" Mar 20 10:57:59 crc kubenswrapper[4860]: I0320 10:57:59.413178 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:59 crc kubenswrapper[4860]: E0320 10:57:59.413510 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:59 crc kubenswrapper[4860]: I0320 10:57:59.422062 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cmc44_a89c8af2-338f-401f-aad5-c6d7763a3b3a/kube-multus/1.log" Mar 20 10:58:00 crc kubenswrapper[4860]: I0320 10:58:00.412915 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:58:00 crc kubenswrapper[4860]: I0320 10:58:00.413030 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:58:00 crc kubenswrapper[4860]: I0320 10:58:00.412948 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:58:00 crc kubenswrapper[4860]: E0320 10:58:00.413114 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:58:00 crc kubenswrapper[4860]: E0320 10:58:00.413260 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:58:00 crc kubenswrapper[4860]: E0320 10:58:00.413586 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:58:01 crc kubenswrapper[4860]: I0320 10:58:01.413202 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:58:01 crc kubenswrapper[4860]: E0320 10:58:01.413414 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:58:02 crc kubenswrapper[4860]: I0320 10:58:02.413453 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:58:02 crc kubenswrapper[4860]: I0320 10:58:02.413554 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:58:02 crc kubenswrapper[4860]: I0320 10:58:02.413613 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:58:02 crc kubenswrapper[4860]: E0320 10:58:02.413732 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:58:02 crc kubenswrapper[4860]: E0320 10:58:02.413796 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:58:02 crc kubenswrapper[4860]: E0320 10:58:02.413911 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:58:02 crc kubenswrapper[4860]: E0320 10:58:02.608825 4860 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 10:58:03 crc kubenswrapper[4860]: I0320 10:58:03.412843 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:58:03 crc kubenswrapper[4860]: E0320 10:58:03.413034 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:58:04 crc kubenswrapper[4860]: I0320 10:58:04.412644 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:58:04 crc kubenswrapper[4860]: I0320 10:58:04.412707 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:58:04 crc kubenswrapper[4860]: I0320 10:58:04.412712 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:58:04 crc kubenswrapper[4860]: E0320 10:58:04.412916 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:58:04 crc kubenswrapper[4860]: E0320 10:58:04.413692 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:58:04 crc kubenswrapper[4860]: E0320 10:58:04.413853 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:58:04 crc kubenswrapper[4860]: I0320 10:58:04.414334 4860 scope.go:117] "RemoveContainer" containerID="8b2175e8b0f3d0a1b8ef8642ce83c4b05c7ce1f29d0f3417822f71330a375049" Mar 20 10:58:05 crc kubenswrapper[4860]: I0320 10:58:05.344912 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-q85gq"] Mar 20 10:58:05 crc kubenswrapper[4860]: I0320 10:58:05.345555 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:58:05 crc kubenswrapper[4860]: E0320 10:58:05.346011 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:58:05 crc kubenswrapper[4860]: I0320 10:58:05.413532 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:58:05 crc kubenswrapper[4860]: E0320 10:58:05.414459 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:58:05 crc kubenswrapper[4860]: I0320 10:58:05.445742 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nbkmw_eb85f6f9-1c0f-4388-9464-25dfe48d8d0f/ovnkube-controller/3.log" Mar 20 10:58:05 crc kubenswrapper[4860]: I0320 10:58:05.448132 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" event={"ID":"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f","Type":"ContainerStarted","Data":"4f5a30ad09d458bec4ede239893e4825bfe7b1e45f543e00827124da5aa465f6"} Mar 20 10:58:05 crc kubenswrapper[4860]: I0320 10:58:05.449744 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:58:06 crc kubenswrapper[4860]: I0320 10:58:06.412441 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:58:06 crc kubenswrapper[4860]: I0320 10:58:06.412595 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:58:06 crc kubenswrapper[4860]: E0320 10:58:06.413017 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:58:06 crc kubenswrapper[4860]: E0320 10:58:06.413128 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:58:07 crc kubenswrapper[4860]: I0320 10:58:07.413165 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:58:07 crc kubenswrapper[4860]: I0320 10:58:07.413204 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:58:07 crc kubenswrapper[4860]: E0320 10:58:07.414572 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:58:07 crc kubenswrapper[4860]: E0320 10:58:07.414832 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:58:07 crc kubenswrapper[4860]: E0320 10:58:07.609690 4860 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 10:58:08 crc kubenswrapper[4860]: I0320 10:58:08.413169 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:58:08 crc kubenswrapper[4860]: E0320 10:58:08.413446 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:58:08 crc kubenswrapper[4860]: I0320 10:58:08.413529 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:58:08 crc kubenswrapper[4860]: E0320 10:58:08.413842 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:58:09 crc kubenswrapper[4860]: I0320 10:58:09.413066 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:58:09 crc kubenswrapper[4860]: E0320 10:58:09.413421 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:58:09 crc kubenswrapper[4860]: I0320 10:58:09.413828 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:58:09 crc kubenswrapper[4860]: E0320 10:58:09.413967 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:58:10 crc kubenswrapper[4860]: I0320 10:58:10.413077 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:58:10 crc kubenswrapper[4860]: I0320 10:58:10.413276 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:58:10 crc kubenswrapper[4860]: E0320 10:58:10.413301 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:58:10 crc kubenswrapper[4860]: E0320 10:58:10.413444 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:58:11 crc kubenswrapper[4860]: I0320 10:58:11.413540 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:58:11 crc kubenswrapper[4860]: I0320 10:58:11.413631 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:58:11 crc kubenswrapper[4860]: E0320 10:58:11.413697 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:58:11 crc kubenswrapper[4860]: E0320 10:58:11.413822 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:58:12 crc kubenswrapper[4860]: I0320 10:58:12.321501 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:12 crc kubenswrapper[4860]: E0320 10:58:12.321667 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 11:00:14.321632297 +0000 UTC m=+338.542993215 (durationBeforeRetry 2m2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:12 crc kubenswrapper[4860]: I0320 10:58:12.413062 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:58:12 crc kubenswrapper[4860]: I0320 10:58:12.413144 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:58:12 crc kubenswrapper[4860]: E0320 10:58:12.413338 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:58:12 crc kubenswrapper[4860]: E0320 10:58:12.413511 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:58:12 crc kubenswrapper[4860]: I0320 10:58:12.422912 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:58:12 crc kubenswrapper[4860]: I0320 10:58:12.422979 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:58:12 crc kubenswrapper[4860]: I0320 10:58:12.423028 4860 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:58:12 crc kubenswrapper[4860]: I0320 10:58:12.423102 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:58:12 crc kubenswrapper[4860]: E0320 10:58:12.423181 4860 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 10:58:12 crc kubenswrapper[4860]: E0320 10:58:12.423273 4860 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 10:58:12 crc kubenswrapper[4860]: E0320 10:58:12.423184 4860 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 10:58:12 crc kubenswrapper[4860]: E0320 10:58:12.423331 4860 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 10:58:12 crc kubenswrapper[4860]: E0320 10:58:12.423330 4860 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not 
registered Mar 20 10:58:12 crc kubenswrapper[4860]: E0320 10:58:12.423367 4860 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 10:58:12 crc kubenswrapper[4860]: E0320 10:58:12.423392 4860 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:58:12 crc kubenswrapper[4860]: E0320 10:58:12.423296 4860 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:58:12 crc kubenswrapper[4860]: E0320 10:58:12.423397 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 11:00:14.423372117 +0000 UTC m=+338.644733005 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 10:58:12 crc kubenswrapper[4860]: E0320 10:58:12.423481 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-20 11:00:14.423454429 +0000 UTC m=+338.644815367 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 10:58:12 crc kubenswrapper[4860]: E0320 10:58:12.423539 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 11:00:14.423524551 +0000 UTC m=+338.644885489 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:58:12 crc kubenswrapper[4860]: E0320 10:58:12.423564 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 11:00:14.423550562 +0000 UTC m=+338.644911510 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:58:12 crc kubenswrapper[4860]: E0320 10:58:12.611875 4860 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 10:58:13 crc kubenswrapper[4860]: I0320 10:58:13.412568 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:58:13 crc kubenswrapper[4860]: I0320 10:58:13.412604 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:58:13 crc kubenswrapper[4860]: E0320 10:58:13.412698 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:58:13 crc kubenswrapper[4860]: E0320 10:58:13.412818 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:58:13 crc kubenswrapper[4860]: I0320 10:58:13.412960 4860 scope.go:117] "RemoveContainer" containerID="e1d897530152cf8d1ddca69100f0ae29a4da57552de29a47aaed46aa70fa805e" Mar 20 10:58:13 crc kubenswrapper[4860]: I0320 10:58:13.436032 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" podStartSLOduration=145.436009945 podStartE2EDuration="2m25.436009945s" podCreationTimestamp="2026-03-20 10:55:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:05.478536905 +0000 UTC m=+209.699897813" watchObservedRunningTime="2026-03-20 10:58:13.436009945 +0000 UTC m=+217.657370843" Mar 20 10:58:14 crc kubenswrapper[4860]: I0320 10:58:14.412985 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:58:14 crc kubenswrapper[4860]: E0320 10:58:14.413145 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:58:14 crc kubenswrapper[4860]: I0320 10:58:14.413349 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:58:14 crc kubenswrapper[4860]: E0320 10:58:14.413421 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:58:14 crc kubenswrapper[4860]: I0320 10:58:14.487906 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cmc44_a89c8af2-338f-401f-aad5-c6d7763a3b3a/kube-multus/1.log" Mar 20 10:58:14 crc kubenswrapper[4860]: I0320 10:58:14.487977 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cmc44" event={"ID":"a89c8af2-338f-401f-aad5-c6d7763a3b3a","Type":"ContainerStarted","Data":"bbf0bd8dd1e8efce7a65cd6499f4e5d67e95f7c0af27658c16d6dad07affb764"} Mar 20 10:58:15 crc kubenswrapper[4860]: I0320 10:58:15.412929 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:58:15 crc kubenswrapper[4860]: I0320 10:58:15.412952 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:58:15 crc kubenswrapper[4860]: E0320 10:58:15.413084 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:58:15 crc kubenswrapper[4860]: E0320 10:58:15.413323 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:58:16 crc kubenswrapper[4860]: I0320 10:58:16.412622 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:58:16 crc kubenswrapper[4860]: E0320 10:58:16.412802 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:58:16 crc kubenswrapper[4860]: I0320 10:58:16.413017 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:58:16 crc kubenswrapper[4860]: E0320 10:58:16.413290 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:58:17 crc kubenswrapper[4860]: I0320 10:58:17.412633 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:58:17 crc kubenswrapper[4860]: I0320 10:58:17.412765 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:58:17 crc kubenswrapper[4860]: E0320 10:58:17.415194 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q85gq" podUID="035f0b3d-92ee-4564-8dad-28b231e1c800" Mar 20 10:58:17 crc kubenswrapper[4860]: E0320 10:58:17.415496 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:58:18 crc kubenswrapper[4860]: I0320 10:58:18.412429 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:58:18 crc kubenswrapper[4860]: I0320 10:58:18.412580 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:58:18 crc kubenswrapper[4860]: I0320 10:58:18.415126 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 20 10:58:18 crc kubenswrapper[4860]: I0320 10:58:18.415750 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 20 10:58:18 crc kubenswrapper[4860]: I0320 10:58:18.415913 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 20 10:58:18 crc kubenswrapper[4860]: I0320 10:58:18.416901 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.413120 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.413600 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.416491 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.417120 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.585813 4860 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.625203 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-76g8f"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.625791 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-76g8f" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.626518 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-hfxcc"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.627191 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-hfxcc" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.629368 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7xnrh"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.629781 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-7xnrh" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.630572 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5pq7k"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.631090 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5pq7k" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.634871 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.635048 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.640761 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.641939 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.642446 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.642722 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.643041 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.643301 4860 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.643983 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.644134 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.644297 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.644397 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.644494 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.646881 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.647094 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.647246 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.647393 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.647613 4860 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.647760 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.647949 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.648128 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.648784 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.648940 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.649128 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.650825 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-srz5x"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.651420 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.651928 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-xmkqm"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.652687 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xmkqm" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.655268 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-6xl8q"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.656058 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-6xl8q" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.656465 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-hz7zj"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.656996 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hz7zj" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.657570 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-pc5tf"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.657953 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-pc5tf" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.658425 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-twkfs"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.658917 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-twkfs" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.659360 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4429p"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.659936 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4429p" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.661333 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-vpp2k"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.662046 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-vpp2k" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.671298 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-s52jd"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.671961 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-s52jd" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.681442 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.681980 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.682147 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.682518 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.688345 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.688703 4860 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.689428 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.694741 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-nxq82"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.708459 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.709102 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nxq82" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.709947 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.711378 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.712199 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.712486 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.712625 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.712731 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 
10:58:19.712813 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.712893 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.712926 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.713022 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.713080 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.713100 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.713238 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.713411 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.713554 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.712736 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.713654 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 
10:58:19.714043 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.714136 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.714242 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.714352 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.714565 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.714741 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.714888 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.715076 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.715167 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.715255 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.715696 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 20 10:58:19 crc kubenswrapper[4860]: 
I0320 10:58:19.719605 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6ef8eec8-b86d-4f5a-931e-c76e11c07f94-client-ca\") pod \"controller-manager-879f6c89f-7xnrh\" (UID: \"6ef8eec8-b86d-4f5a-931e-c76e11c07f94\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7xnrh" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.719644 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba450d5b-c962-4788-8215-d1eb12f9b314-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-76g8f\" (UID: \"ba450d5b-c962-4788-8215-d1eb12f9b314\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-76g8f" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.719667 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6c5c\" (UniqueName: \"kubernetes.io/projected/a58a54d3-d454-4503-8b70-0e78784efdfc-kube-api-access-v6c5c\") pod \"etcd-operator-b45778765-hfxcc\" (UID: \"a58a54d3-d454-4503-8b70-0e78784efdfc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hfxcc" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.719689 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a58a54d3-d454-4503-8b70-0e78784efdfc-etcd-client\") pod \"etcd-operator-b45778765-hfxcc\" (UID: \"a58a54d3-d454-4503-8b70-0e78784efdfc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hfxcc" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.719709 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a58a54d3-d454-4503-8b70-0e78784efdfc-config\") pod \"etcd-operator-b45778765-hfxcc\" (UID: 
\"a58a54d3-d454-4503-8b70-0e78784efdfc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hfxcc" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.719730 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcm82\" (UniqueName: \"kubernetes.io/projected/ba450d5b-c962-4788-8215-d1eb12f9b314-kube-api-access-vcm82\") pod \"openshift-apiserver-operator-796bbdcf4f-76g8f\" (UID: \"ba450d5b-c962-4788-8215-d1eb12f9b314\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-76g8f" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.719749 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/a58a54d3-d454-4503-8b70-0e78784efdfc-etcd-service-ca\") pod \"etcd-operator-b45778765-hfxcc\" (UID: \"a58a54d3-d454-4503-8b70-0e78784efdfc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hfxcc" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.719780 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6ef8eec8-b86d-4f5a-931e-c76e11c07f94-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-7xnrh\" (UID: \"6ef8eec8-b86d-4f5a-931e-c76e11c07f94\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7xnrh" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.719799 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cfa87f04-40c6-4575-b647-fb13a115b81d-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-5pq7k\" (UID: \"cfa87f04-40c6-4575-b647-fb13a115b81d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5pq7k" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.719816 4860 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfa87f04-40c6-4575-b647-fb13a115b81d-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-5pq7k\" (UID: \"cfa87f04-40c6-4575-b647-fb13a115b81d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5pq7k" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.719835 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a58a54d3-d454-4503-8b70-0e78784efdfc-serving-cert\") pod \"etcd-operator-b45778765-hfxcc\" (UID: \"a58a54d3-d454-4503-8b70-0e78784efdfc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hfxcc" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.719855 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7g2j\" (UniqueName: \"kubernetes.io/projected/6ef8eec8-b86d-4f5a-931e-c76e11c07f94-kube-api-access-x7g2j\") pod \"controller-manager-879f6c89f-7xnrh\" (UID: \"6ef8eec8-b86d-4f5a-931e-c76e11c07f94\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7xnrh" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.719872 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/a58a54d3-d454-4503-8b70-0e78784efdfc-etcd-ca\") pod \"etcd-operator-b45778765-hfxcc\" (UID: \"a58a54d3-d454-4503-8b70-0e78784efdfc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hfxcc" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.719904 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ef8eec8-b86d-4f5a-931e-c76e11c07f94-serving-cert\") pod \"controller-manager-879f6c89f-7xnrh\" 
(UID: \"6ef8eec8-b86d-4f5a-931e-c76e11c07f94\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7xnrh" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.719924 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqjlf\" (UniqueName: \"kubernetes.io/projected/cfa87f04-40c6-4575-b647-fb13a115b81d-kube-api-access-bqjlf\") pod \"openshift-controller-manager-operator-756b6f6bc6-5pq7k\" (UID: \"cfa87f04-40c6-4575-b647-fb13a115b81d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5pq7k" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.719942 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ef8eec8-b86d-4f5a-931e-c76e11c07f94-config\") pod \"controller-manager-879f6c89f-7xnrh\" (UID: \"6ef8eec8-b86d-4f5a-931e-c76e11c07f94\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7xnrh" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.719969 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba450d5b-c962-4788-8215-d1eb12f9b314-config\") pod \"openshift-apiserver-operator-796bbdcf4f-76g8f\" (UID: \"ba450d5b-c962-4788-8215-d1eb12f9b314\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-76g8f" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.720104 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.720138 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.720361 4860 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.720522 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.720591 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.720648 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.720688 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.720767 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.720784 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.720867 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.720879 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.720873 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.720964 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.720979 4860 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.721111 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-lcdbx"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.721991 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lcdbx" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.721116 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.721162 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.721914 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.722551 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.722837 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wv5fd"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.723506 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wv5fd" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.723863 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-5gdgj"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.724604 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-5gdgj" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.725023 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.724994 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.725124 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.725456 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.725466 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.725618 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.725650 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.725813 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.725865 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.725952 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 
20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.725881 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-pr8t8"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.726486 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.726677 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pr8t8" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.727163 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.727870 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.735766 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.736849 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-sqrz5"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.737397 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.737553 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.738022 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-sqrz5" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.747731 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v6ff7"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.759037 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.759058 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.759123 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.759138 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.769962 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.771586 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.772021 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.772024 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.772152 4860 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.772614 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.773475 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-45vfv"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.773734 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.774021 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-rtgqn"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.774357 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v6ff7" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.774452 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-45vfv" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.774873 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rtgqn" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.782490 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jvcqp"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.783129 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jvcqp" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.783674 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-8dbgm"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.784543 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jnqsw"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.785340 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jnqsw" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.784558 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.792444 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qgkd4"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.793464 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qgkd4" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.794600 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-2glmz"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.794809 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.795788 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2glmz" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.800693 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6stjq"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.801429 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-vk2rn"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.801678 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6stjq" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.803391 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vfkd4"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.803483 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-vk2rn" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.808798 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.815803 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mcqdw"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.816211 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vfkd4" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.817486 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-9ktqw"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.817878 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-x4x44"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.818095 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mcqdw" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.818402 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gf5nr"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.818574 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-9ktqw" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.818761 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-x4x44" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.819343 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566725-d6wf9"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.819707 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-d6wf9" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.819883 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566738-5cj22"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.820025 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gf5nr" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.820297 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566738-5cj22" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.820776 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6ef8eec8-b86d-4f5a-931e-c76e11c07f94-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-7xnrh\" (UID: \"6ef8eec8-b86d-4f5a-931e-c76e11c07f94\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7xnrh" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.820875 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cfa87f04-40c6-4575-b647-fb13a115b81d-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-5pq7k\" (UID: \"cfa87f04-40c6-4575-b647-fb13a115b81d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5pq7k" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.820954 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfa87f04-40c6-4575-b647-fb13a115b81d-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-5pq7k\" (UID: \"cfa87f04-40c6-4575-b647-fb13a115b81d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5pq7k" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.821036 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsxn6\" (UniqueName: \"kubernetes.io/projected/24a452b3-94a8-4c29-8409-cb1a8dd11555-kube-api-access-rsxn6\") pod \"machine-config-controller-84d6567774-pr8t8\" (UID: 
\"24a452b3-94a8-4c29-8409-cb1a8dd11555\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pr8t8" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.821128 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rh6md\" (UniqueName: \"kubernetes.io/projected/37972326-b1df-484f-ab10-9c595b145d8c-kube-api-access-rh6md\") pod \"migrator-59844c95c7-lcdbx\" (UID: \"37972326-b1df-484f-ab10-9c595b145d8c\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lcdbx" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.821249 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/d62cba0d-d390-4638-aa42-59631e4bf118-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-4429p\" (UID: \"d62cba0d-d390-4638-aa42-59631e4bf118\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4429p" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.821360 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e8ca532e-b0d7-494c-886f-bff0c8009707-console-serving-cert\") pod \"console-f9d7485db-sqrz5\" (UID: \"e8ca532e-b0d7-494c-886f-bff0c8009707\") " pod="openshift-console/console-f9d7485db-sqrz5" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.821475 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqnwc\" (UniqueName: \"kubernetes.io/projected/3587f3ba-577b-425a-adf5-336a8977dcc5-kube-api-access-rqnwc\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.821617 4860 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/a8f2eaf6-3749-4695-8df1-5972598c8ac6-image-import-ca\") pod \"apiserver-76f77b778f-vpp2k\" (UID: \"a8f2eaf6-3749-4695-8df1-5972598c8ac6\") " pod="openshift-apiserver/apiserver-76f77b778f-vpp2k" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.821727 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8f2eaf6-3749-4695-8df1-5972598c8ac6-serving-cert\") pod \"apiserver-76f77b778f-vpp2k\" (UID: \"a8f2eaf6-3749-4695-8df1-5972598c8ac6\") " pod="openshift-apiserver/apiserver-76f77b778f-vpp2k" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.821846 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a58a54d3-d454-4503-8b70-0e78784efdfc-serving-cert\") pod \"etcd-operator-b45778765-hfxcc\" (UID: \"a58a54d3-d454-4503-8b70-0e78784efdfc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hfxcc" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.821968 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.822089 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3587f3ba-577b-425a-adf5-336a8977dcc5-audit-dir\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.822342 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.822465 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/d582dc3e-7510-42be-aa3a-1d15b35c327c-stats-auth\") pod \"router-default-5444994796-5gdgj\" (UID: \"d582dc3e-7510-42be-aa3a-1d15b35c327c\") " pod="openshift-ingress/router-default-5444994796-5gdgj" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.822571 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/0eff7ea5-251b-44de-b129-c604349d6e6c-encryption-config\") pod \"apiserver-7bbb656c7d-twkfs\" (UID: \"0eff7ea5-251b-44de-b129-c604349d6e6c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-twkfs" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.822792 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9ffz6"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.822873 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a8f2eaf6-3749-4695-8df1-5972598c8ac6-trusted-ca-bundle\") pod \"apiserver-76f77b778f-vpp2k\" (UID: \"a8f2eaf6-3749-4695-8df1-5972598c8ac6\") " pod="openshift-apiserver/apiserver-76f77b778f-vpp2k" Mar 20 10:58:19 crc 
kubenswrapper[4860]: I0320 10:58:19.823124 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkfzm\" (UniqueName: \"kubernetes.io/projected/a8f2eaf6-3749-4695-8df1-5972598c8ac6-kube-api-access-tkfzm\") pod \"apiserver-76f77b778f-vpp2k\" (UID: \"a8f2eaf6-3749-4695-8df1-5972598c8ac6\") " pod="openshift-apiserver/apiserver-76f77b778f-vpp2k" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.823182 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e8ca532e-b0d7-494c-886f-bff0c8009707-console-config\") pod \"console-f9d7485db-sqrz5\" (UID: \"e8ca532e-b0d7-494c-886f-bff0c8009707\") " pod="openshift-console/console-f9d7485db-sqrz5" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.823218 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.823304 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d4ce1856-395a-4003-9642-61da7cbdd789-images\") pod \"machine-api-operator-5694c8668f-s52jd\" (UID: \"d4ce1856-395a-4003-9642-61da7cbdd789\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-s52jd" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.823380 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7g2j\" (UniqueName: \"kubernetes.io/projected/6ef8eec8-b86d-4f5a-931e-c76e11c07f94-kube-api-access-x7g2j\") pod 
\"controller-manager-879f6c89f-7xnrh\" (UID: \"6ef8eec8-b86d-4f5a-931e-c76e11c07f94\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7xnrh" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.823413 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/a58a54d3-d454-4503-8b70-0e78784efdfc-etcd-ca\") pod \"etcd-operator-b45778765-hfxcc\" (UID: \"a58a54d3-d454-4503-8b70-0e78784efdfc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hfxcc" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.823448 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcrkt\" (UniqueName: \"kubernetes.io/projected/d582dc3e-7510-42be-aa3a-1d15b35c327c-kube-api-access-bcrkt\") pod \"router-default-5444994796-5gdgj\" (UID: \"d582dc3e-7510-42be-aa3a-1d15b35c327c\") " pod="openshift-ingress/router-default-5444994796-5gdgj" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.823493 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ef8eec8-b86d-4f5a-931e-c76e11c07f94-serving-cert\") pod \"controller-manager-879f6c89f-7xnrh\" (UID: \"6ef8eec8-b86d-4f5a-931e-c76e11c07f94\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7xnrh" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.823523 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lj2w\" (UniqueName: \"kubernetes.io/projected/f67156a9-f474-4d80-9789-ffbfcc9ec78b-kube-api-access-9lj2w\") pod \"openshift-config-operator-7777fb866f-xmkqm\" (UID: \"f67156a9-f474-4d80-9789-ffbfcc9ec78b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xmkqm" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.823557 4860 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-bqjlf\" (UniqueName: \"kubernetes.io/projected/cfa87f04-40c6-4575-b647-fb13a115b81d-kube-api-access-bqjlf\") pod \"openshift-controller-manager-operator-756b6f6bc6-5pq7k\" (UID: \"cfa87f04-40c6-4575-b647-fb13a115b81d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5pq7k" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.823591 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/0eff7ea5-251b-44de-b129-c604349d6e6c-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-twkfs\" (UID: \"0eff7ea5-251b-44de-b129-c604349d6e6c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-twkfs" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.823619 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0eff7ea5-251b-44de-b129-c604349d6e6c-serving-cert\") pod \"apiserver-7bbb656c7d-twkfs\" (UID: \"0eff7ea5-251b-44de-b129-c604349d6e6c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-twkfs" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.823639 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfa87f04-40c6-4575-b647-fb13a115b81d-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-5pq7k\" (UID: \"cfa87f04-40c6-4575-b647-fb13a115b81d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5pq7k" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.823650 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8wdj\" (UniqueName: \"kubernetes.io/projected/7ee7ff56-a3fb-465a-9ab5-8cba6bbfcd0e-kube-api-access-v8wdj\") pod \"console-operator-58897d9998-pc5tf\" (UID: 
\"7ee7ff56-a3fb-465a-9ab5-8cba6bbfcd0e\") " pod="openshift-console-operator/console-operator-58897d9998-pc5tf" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.823558 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6ef8eec8-b86d-4f5a-931e-c76e11c07f94-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-7xnrh\" (UID: \"6ef8eec8-b86d-4f5a-931e-c76e11c07f94\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7xnrh" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.823685 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ef8eec8-b86d-4f5a-931e-c76e11c07f94-config\") pod \"controller-manager-879f6c89f-7xnrh\" (UID: \"6ef8eec8-b86d-4f5a-931e-c76e11c07f94\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7xnrh" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.824095 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d62cba0d-d390-4638-aa42-59631e4bf118-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-4429p\" (UID: \"d62cba0d-d390-4638-aa42-59631e4bf118\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4429p" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.824132 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8ca532e-b0d7-494c-886f-bff0c8009707-trusted-ca-bundle\") pod \"console-f9d7485db-sqrz5\" (UID: \"e8ca532e-b0d7-494c-886f-bff0c8009707\") " pod="openshift-console/console-f9d7485db-sqrz5" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.824265 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: 
\"kubernetes.io/configmap/a58a54d3-d454-4503-8b70-0e78784efdfc-etcd-ca\") pod \"etcd-operator-b45778765-hfxcc\" (UID: \"a58a54d3-d454-4503-8b70-0e78784efdfc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hfxcc" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.824430 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba450d5b-c962-4788-8215-d1eb12f9b314-config\") pod \"openshift-apiserver-operator-796bbdcf4f-76g8f\" (UID: \"ba450d5b-c962-4788-8215-d1eb12f9b314\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-76g8f" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.824589 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.824633 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e8ca532e-b0d7-494c-886f-bff0c8009707-console-oauth-config\") pod \"console-f9d7485db-sqrz5\" (UID: \"e8ca532e-b0d7-494c-886f-bff0c8009707\") " pod="openshift-console/console-f9d7485db-sqrz5" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.824659 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0eff7ea5-251b-44de-b129-c604349d6e6c-etcd-client\") pod \"apiserver-7bbb656c7d-twkfs\" (UID: \"0eff7ea5-251b-44de-b129-c604349d6e6c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-twkfs" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 
10:58:19.824683 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/2f2e3391-68a2-43a8-aba9-17e583066b03-machine-approver-tls\") pod \"machine-approver-56656f9798-hz7zj\" (UID: \"2f2e3391-68a2-43a8-aba9-17e583066b03\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hz7zj" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.824703 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6r94\" (UniqueName: \"kubernetes.io/projected/2f2e3391-68a2-43a8-aba9-17e583066b03-kube-api-access-r6r94\") pod \"machine-approver-56656f9798-hz7zj\" (UID: \"2f2e3391-68a2-43a8-aba9-17e583066b03\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hz7zj" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.824790 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.824815 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f15060fa-5a28-4a12-be7b-2823e921eb90-serving-cert\") pod \"route-controller-manager-6576b87f9c-nxq82\" (UID: \"f15060fa-5a28-4a12-be7b-2823e921eb90\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nxq82" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.824837 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/7ee7ff56-a3fb-465a-9ab5-8cba6bbfcd0e-trusted-ca\") pod \"console-operator-58897d9998-pc5tf\" (UID: \"7ee7ff56-a3fb-465a-9ab5-8cba6bbfcd0e\") " pod="openshift-console-operator/console-operator-58897d9998-pc5tf" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.824883 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqqmp\" (UniqueName: \"kubernetes.io/projected/d4ce1856-395a-4003-9642-61da7cbdd789-kube-api-access-zqqmp\") pod \"machine-api-operator-5694c8668f-s52jd\" (UID: \"d4ce1856-395a-4003-9642-61da7cbdd789\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-s52jd" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.825018 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8f2eaf6-3749-4695-8df1-5972598c8ac6-config\") pod \"apiserver-76f77b778f-vpp2k\" (UID: \"a8f2eaf6-3749-4695-8df1-5972598c8ac6\") " pod="openshift-apiserver/apiserver-76f77b778f-vpp2k" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.825050 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a8f2eaf6-3749-4695-8df1-5972598c8ac6-encryption-config\") pod \"apiserver-76f77b778f-vpp2k\" (UID: \"a8f2eaf6-3749-4695-8df1-5972598c8ac6\") " pod="openshift-apiserver/apiserver-76f77b778f-vpp2k" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.825162 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/d4ce1856-395a-4003-9642-61da7cbdd789-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-s52jd\" (UID: \"d4ce1856-395a-4003-9642-61da7cbdd789\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-s52jd" Mar 20 10:58:19 crc kubenswrapper[4860]: 
I0320 10:58:19.825289 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/24a452b3-94a8-4c29-8409-cb1a8dd11555-proxy-tls\") pod \"machine-config-controller-84d6567774-pr8t8\" (UID: \"24a452b3-94a8-4c29-8409-cb1a8dd11555\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pr8t8" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.825319 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/a8f2eaf6-3749-4695-8df1-5972598c8ac6-audit\") pod \"apiserver-76f77b778f-vpp2k\" (UID: \"a8f2eaf6-3749-4695-8df1-5972598c8ac6\") " pod="openshift-apiserver/apiserver-76f77b778f-vpp2k" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.825351 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/d582dc3e-7510-42be-aa3a-1d15b35c327c-default-certificate\") pod \"router-default-5444994796-5gdgj\" (UID: \"d582dc3e-7510-42be-aa3a-1d15b35c327c\") " pod="openshift-ingress/router-default-5444994796-5gdgj" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.825392 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ee7ff56-a3fb-465a-9ab5-8cba6bbfcd0e-config\") pod \"console-operator-58897d9998-pc5tf\" (UID: \"7ee7ff56-a3fb-465a-9ab5-8cba6bbfcd0e\") " pod="openshift-console-operator/console-operator-58897d9998-pc5tf" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.825437 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6ef8eec8-b86d-4f5a-931e-c76e11c07f94-client-ca\") pod \"controller-manager-879f6c89f-7xnrh\" (UID: \"6ef8eec8-b86d-4f5a-931e-c76e11c07f94\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-7xnrh" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.825486 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba450d5b-c962-4788-8215-d1eb12f9b314-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-76g8f\" (UID: \"ba450d5b-c962-4788-8215-d1eb12f9b314\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-76g8f" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.825522 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f15060fa-5a28-4a12-be7b-2823e921eb90-config\") pod \"route-controller-manager-6576b87f9c-nxq82\" (UID: \"f15060fa-5a28-4a12-be7b-2823e921eb90\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nxq82" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.825604 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d88n2\" (UniqueName: \"kubernetes.io/projected/f15060fa-5a28-4a12-be7b-2823e921eb90-kube-api-access-d88n2\") pod \"route-controller-manager-6576b87f9c-nxq82\" (UID: \"f15060fa-5a28-4a12-be7b-2823e921eb90\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nxq82" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.825632 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0eff7ea5-251b-44de-b129-c604349d6e6c-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-twkfs\" (UID: \"0eff7ea5-251b-44de-b129-c604349d6e6c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-twkfs" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.825965 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-76g8f"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.826057 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-hfxcc"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.826208 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9ffz6" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.826636 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6c5c\" (UniqueName: \"kubernetes.io/projected/a58a54d3-d454-4503-8b70-0e78784efdfc-kube-api-access-v6c5c\") pod \"etcd-operator-b45778765-hfxcc\" (UID: \"a58a54d3-d454-4503-8b70-0e78784efdfc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hfxcc" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.826691 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.826732 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.826766 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lk8bz\" 
(UniqueName: \"kubernetes.io/projected/e8ca532e-b0d7-494c-886f-bff0c8009707-kube-api-access-lk8bz\") pod \"console-f9d7485db-sqrz5\" (UID: \"e8ca532e-b0d7-494c-886f-bff0c8009707\") " pod="openshift-console/console-f9d7485db-sqrz5" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.826796 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9krs\" (UniqueName: \"kubernetes.io/projected/0eff7ea5-251b-44de-b129-c604349d6e6c-kube-api-access-c9krs\") pod \"apiserver-7bbb656c7d-twkfs\" (UID: \"0eff7ea5-251b-44de-b129-c604349d6e6c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-twkfs" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.826844 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/f67156a9-f474-4d80-9789-ffbfcc9ec78b-available-featuregates\") pod \"openshift-config-operator-7777fb866f-xmkqm\" (UID: \"f67156a9-f474-4d80-9789-ffbfcc9ec78b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xmkqm" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.826876 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.826908 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.826935 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e8ca532e-b0d7-494c-886f-bff0c8009707-oauth-serving-cert\") pod \"console-f9d7485db-sqrz5\" (UID: \"e8ca532e-b0d7-494c-886f-bff0c8009707\") " pod="openshift-console/console-f9d7485db-sqrz5" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.826965 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ee7ff56-a3fb-465a-9ab5-8cba6bbfcd0e-serving-cert\") pod \"console-operator-58897d9998-pc5tf\" (UID: \"7ee7ff56-a3fb-465a-9ab5-8cba6bbfcd0e\") " pod="openshift-console-operator/console-operator-58897d9998-pc5tf" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.827142 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a58a54d3-d454-4503-8b70-0e78784efdfc-etcd-client\") pod \"etcd-operator-b45778765-hfxcc\" (UID: \"a58a54d3-d454-4503-8b70-0e78784efdfc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hfxcc" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.827194 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a8f2eaf6-3749-4695-8df1-5972598c8ac6-etcd-serving-ca\") pod \"apiserver-76f77b778f-vpp2k\" (UID: \"a8f2eaf6-3749-4695-8df1-5972598c8ac6\") " pod="openshift-apiserver/apiserver-76f77b778f-vpp2k" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.827245 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a8f2eaf6-3749-4695-8df1-5972598c8ac6-audit-dir\") pod 
\"apiserver-76f77b778f-vpp2k\" (UID: \"a8f2eaf6-3749-4695-8df1-5972598c8ac6\") " pod="openshift-apiserver/apiserver-76f77b778f-vpp2k" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.827274 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d582dc3e-7510-42be-aa3a-1d15b35c327c-service-ca-bundle\") pod \"router-default-5444994796-5gdgj\" (UID: \"d582dc3e-7510-42be-aa3a-1d15b35c327c\") " pod="openshift-ingress/router-default-5444994796-5gdgj" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.827502 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zhgh4"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.827757 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a58a54d3-d454-4503-8b70-0e78784efdfc-config\") pod \"etcd-operator-b45778765-hfxcc\" (UID: \"a58a54d3-d454-4503-8b70-0e78784efdfc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hfxcc" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.827819 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/24a452b3-94a8-4c29-8409-cb1a8dd11555-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-pr8t8\" (UID: \"24a452b3-94a8-4c29-8409-cb1a8dd11555\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pr8t8" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.827873 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a8f2eaf6-3749-4695-8df1-5972598c8ac6-etcd-client\") pod \"apiserver-76f77b778f-vpp2k\" (UID: \"a8f2eaf6-3749-4695-8df1-5972598c8ac6\") " 
pod="openshift-apiserver/apiserver-76f77b778f-vpp2k" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.827897 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f1a219c6-74dc-4511-867e-cf2fce301cad-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wv5fd\" (UID: \"f1a219c6-74dc-4511-867e-cf2fce301cad\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wv5fd" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.828045 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6ef8eec8-b86d-4f5a-931e-c76e11c07f94-client-ca\") pod \"controller-manager-879f6c89f-7xnrh\" (UID: \"6ef8eec8-b86d-4f5a-931e-c76e11c07f94\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7xnrh" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.828140 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcm82\" (UniqueName: \"kubernetes.io/projected/ba450d5b-c962-4788-8215-d1eb12f9b314-kube-api-access-vcm82\") pod \"openshift-apiserver-operator-796bbdcf4f-76g8f\" (UID: \"ba450d5b-c962-4788-8215-d1eb12f9b314\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-76g8f" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.828167 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4ce1856-395a-4003-9642-61da7cbdd789-config\") pod \"machine-api-operator-5694c8668f-s52jd\" (UID: \"d4ce1856-395a-4003-9642-61da7cbdd789\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-s52jd" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.828214 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/3587f3ba-577b-425a-adf5-336a8977dcc5-audit-policies\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.828257 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d62cba0d-d390-4638-aa42-59631e4bf118-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-4429p\" (UID: \"d62cba0d-d390-4638-aa42-59631e4bf118\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4429p" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.828443 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/a58a54d3-d454-4503-8b70-0e78784efdfc-etcd-service-ca\") pod \"etcd-operator-b45778765-hfxcc\" (UID: \"a58a54d3-d454-4503-8b70-0e78784efdfc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hfxcc" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.828552 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a58a54d3-d454-4503-8b70-0e78784efdfc-config\") pod \"etcd-operator-b45778765-hfxcc\" (UID: \"a58a54d3-d454-4503-8b70-0e78784efdfc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hfxcc" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.828956 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a8f2eaf6-3749-4695-8df1-5972598c8ac6-node-pullsecrets\") pod \"apiserver-76f77b778f-vpp2k\" (UID: \"a8f2eaf6-3749-4695-8df1-5972598c8ac6\") " pod="openshift-apiserver/apiserver-76f77b778f-vpp2k" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.829028 4860 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwq9m\" (UniqueName: \"kubernetes.io/projected/c9dab77c-3c60-4c91-8c0a-31791124462d-kube-api-access-rwq9m\") pod \"dns-operator-744455d44c-6xl8q\" (UID: \"c9dab77c-3c60-4c91-8c0a-31791124462d\") " pod="openshift-dns-operator/dns-operator-744455d44c-6xl8q" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.829049 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f2e3391-68a2-43a8-aba9-17e583066b03-config\") pod \"machine-approver-56656f9798-hz7zj\" (UID: \"2f2e3391-68a2-43a8-aba9-17e583066b03\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hz7zj" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.829071 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f1a219c6-74dc-4511-867e-cf2fce301cad-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wv5fd\" (UID: \"f1a219c6-74dc-4511-867e-cf2fce301cad\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wv5fd" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.829088 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0eff7ea5-251b-44de-b129-c604349d6e6c-audit-dir\") pod \"apiserver-7bbb656c7d-twkfs\" (UID: \"0eff7ea5-251b-44de-b129-c604349d6e6c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-twkfs" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.829104 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2f2e3391-68a2-43a8-aba9-17e583066b03-auth-proxy-config\") pod \"machine-approver-56656f9798-hz7zj\" (UID: 
\"2f2e3391-68a2-43a8-aba9-17e583066b03\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hz7zj" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.829444 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/a58a54d3-d454-4503-8b70-0e78784efdfc-etcd-service-ca\") pod \"etcd-operator-b45778765-hfxcc\" (UID: \"a58a54d3-d454-4503-8b70-0e78784efdfc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hfxcc" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.829583 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba450d5b-c962-4788-8215-d1eb12f9b314-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-76g8f\" (UID: \"ba450d5b-c962-4788-8215-d1eb12f9b314\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-76g8f" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.829128 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.829659 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f15060fa-5a28-4a12-be7b-2823e921eb90-client-ca\") pod \"route-controller-manager-6576b87f9c-nxq82\" (UID: \"f15060fa-5a28-4a12-be7b-2823e921eb90\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nxq82" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.829678 4860 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d582dc3e-7510-42be-aa3a-1d15b35c327c-metrics-certs\") pod \"router-default-5444994796-5gdgj\" (UID: \"d582dc3e-7510-42be-aa3a-1d15b35c327c\") " pod="openshift-ingress/router-default-5444994796-5gdgj" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.829697 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e8ca532e-b0d7-494c-886f-bff0c8009707-service-ca\") pod \"console-f9d7485db-sqrz5\" (UID: \"e8ca532e-b0d7-494c-886f-bff0c8009707\") " pod="openshift-console/console-f9d7485db-sqrz5" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.829721 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.829740 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d989g\" (UniqueName: \"kubernetes.io/projected/d62cba0d-d390-4638-aa42-59631e4bf118-kube-api-access-d989g\") pod \"cluster-image-registry-operator-dc59b4c8b-4429p\" (UID: \"d62cba0d-d390-4638-aa42-59631e4bf118\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4429p" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.829925 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ef8eec8-b86d-4f5a-931e-c76e11c07f94-serving-cert\") pod \"controller-manager-879f6c89f-7xnrh\" (UID: \"6ef8eec8-b86d-4f5a-931e-c76e11c07f94\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-7xnrh" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.829934 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.829948 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c9dab77c-3c60-4c91-8c0a-31791124462d-metrics-tls\") pod \"dns-operator-744455d44c-6xl8q\" (UID: \"c9dab77c-3c60-4c91-8c0a-31791124462d\") " pod="openshift-dns-operator/dns-operator-744455d44c-6xl8q" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.829981 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1a219c6-74dc-4511-867e-cf2fce301cad-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wv5fd\" (UID: \"f1a219c6-74dc-4511-867e-cf2fce301cad\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wv5fd" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.830032 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0eff7ea5-251b-44de-b129-c604349d6e6c-audit-policies\") pod \"apiserver-7bbb656c7d-twkfs\" (UID: \"0eff7ea5-251b-44de-b129-c604349d6e6c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-twkfs" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.830051 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f67156a9-f474-4d80-9789-ffbfcc9ec78b-serving-cert\") pod \"openshift-config-operator-7777fb866f-xmkqm\" (UID: \"f67156a9-f474-4d80-9789-ffbfcc9ec78b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xmkqm" Mar 20 10:58:19 
crc kubenswrapper[4860]: I0320 10:58:19.831511 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a58a54d3-d454-4503-8b70-0e78784efdfc-etcd-client\") pod \"etcd-operator-b45778765-hfxcc\" (UID: \"a58a54d3-d454-4503-8b70-0e78784efdfc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hfxcc" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.832190 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ef8eec8-b86d-4f5a-931e-c76e11c07f94-config\") pod \"controller-manager-879f6c89f-7xnrh\" (UID: \"6ef8eec8-b86d-4f5a-931e-c76e11c07f94\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7xnrh" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.835852 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zhgh4" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.836823 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba450d5b-c962-4788-8215-d1eb12f9b314-config\") pod \"openshift-apiserver-operator-796bbdcf4f-76g8f\" (UID: \"ba450d5b-c962-4788-8215-d1eb12f9b314\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-76g8f" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.840930 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a58a54d3-d454-4503-8b70-0e78784efdfc-serving-cert\") pod \"etcd-operator-b45778765-hfxcc\" (UID: \"a58a54d3-d454-4503-8b70-0e78784efdfc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hfxcc" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.842520 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7xnrh"] Mar 20 10:58:19 crc 
kubenswrapper[4860]: I0320 10:58:19.843985 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-6xl8q"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.846018 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-twkfs"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.847129 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-xmkqm"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.848632 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-jffj8"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.849635 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-jffj8" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.850141 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-lcdbx"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.851140 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-vpp2k"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.852275 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-nxq82"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.854719 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-srz5x"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.855146 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cfa87f04-40c6-4575-b647-fb13a115b81d-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-5pq7k\" 
(UID: \"cfa87f04-40c6-4575-b647-fb13a115b81d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5pq7k" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.855337 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.859651 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-sqrz5"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.859694 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-2k58g"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.865976 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-pc5tf"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.866139 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-2k58g" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.875384 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-wt65f"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.877343 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-wt65f" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.879358 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-rtgqn"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.881477 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-45vfv"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.884021 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jvcqp"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.888690 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.888767 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qgkd4"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.889753 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vfkd4"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.891203 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5pq7k"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.892050 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wv5fd"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.893628 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566738-5cj22"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.894139 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4429p"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.895170 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-l2xf5"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.896412 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9ffz6"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.896531 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-l2xf5" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.897471 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566725-d6wf9"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.898482 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-x4xrf"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.899092 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-x4xrf" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.899771 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-pr8t8"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.901261 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-8dbgm"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.902006 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6stjq"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.903119 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-vk2rn"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.904339 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jnqsw"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.906174 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gf5nr"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.906997 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-2glmz"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.908188 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-s52jd"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.909379 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mcqdw"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.910988 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-service-ca/service-ca-9c57cc56f-9ktqw"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.911074 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.912301 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v6ff7"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.913717 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-2k58g"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.914872 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-wt65f"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.916320 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-l2xf5"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.917073 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-x4x44"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.918337 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zhgh4"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.919257 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-jffj8"] Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.928344 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.931266 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0eff7ea5-251b-44de-b129-c604349d6e6c-audit-dir\") pod 
\"apiserver-7bbb656c7d-twkfs\" (UID: \"0eff7ea5-251b-44de-b129-c604349d6e6c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-twkfs" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.931349 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.931418 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e8ca532e-b0d7-494c-886f-bff0c8009707-service-ca\") pod \"console-f9d7485db-sqrz5\" (UID: \"e8ca532e-b0d7-494c-886f-bff0c8009707\") " pod="openshift-console/console-f9d7485db-sqrz5" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.931460 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d989g\" (UniqueName: \"kubernetes.io/projected/d62cba0d-d390-4638-aa42-59631e4bf118-kube-api-access-d989g\") pod \"cluster-image-registry-operator-dc59b4c8b-4429p\" (UID: \"d62cba0d-d390-4638-aa42-59631e4bf118\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4429p" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.931577 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b0b480d-ae68-4b26-b9f8-6b3caef70971-service-ca-bundle\") pod \"authentication-operator-69f744f599-jffj8\" (UID: \"8b0b480d-ae68-4b26-b9f8-6b3caef70971\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jffj8" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.931621 4860 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f67156a9-f474-4d80-9789-ffbfcc9ec78b-serving-cert\") pod \"openshift-config-operator-7777fb866f-xmkqm\" (UID: \"f67156a9-f474-4d80-9789-ffbfcc9ec78b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xmkqm" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.931842 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e8ca532e-b0d7-494c-886f-bff0c8009707-console-serving-cert\") pod \"console-f9d7485db-sqrz5\" (UID: \"e8ca532e-b0d7-494c-886f-bff0c8009707\") " pod="openshift-console/console-f9d7485db-sqrz5" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.932012 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0eff7ea5-251b-44de-b129-c604349d6e6c-audit-dir\") pod \"apiserver-7bbb656c7d-twkfs\" (UID: \"0eff7ea5-251b-44de-b129-c604349d6e6c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-twkfs" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.932941 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/d62cba0d-d390-4638-aa42-59631e4bf118-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-4429p\" (UID: \"d62cba0d-d390-4638-aa42-59631e4bf118\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4429p" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.933055 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/a8f2eaf6-3749-4695-8df1-5972598c8ac6-image-import-ca\") pod \"apiserver-76f77b778f-vpp2k\" (UID: \"a8f2eaf6-3749-4695-8df1-5972598c8ac6\") " pod="openshift-apiserver/apiserver-76f77b778f-vpp2k" Mar 20 10:58:19 crc 
kubenswrapper[4860]: I0320 10:58:19.933119 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8f2eaf6-3749-4695-8df1-5972598c8ac6-serving-cert\") pod \"apiserver-76f77b778f-vpp2k\" (UID: \"a8f2eaf6-3749-4695-8df1-5972598c8ac6\") " pod="openshift-apiserver/apiserver-76f77b778f-vpp2k" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.933180 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqnwc\" (UniqueName: \"kubernetes.io/projected/3587f3ba-577b-425a-adf5-336a8977dcc5-kube-api-access-rqnwc\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.933217 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.933283 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/0eff7ea5-251b-44de-b129-c604349d6e6c-encryption-config\") pod \"apiserver-7bbb656c7d-twkfs\" (UID: \"0eff7ea5-251b-44de-b129-c604349d6e6c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-twkfs" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.933344 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8fe93f79-239c-4b6a-bd22-bbdf55aff0af-srv-cert\") pod \"catalog-operator-68c6474976-gf5nr\" (UID: \"8fe93f79-239c-4b6a-bd22-bbdf55aff0af\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gf5nr" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.933376 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b0b480d-ae68-4b26-b9f8-6b3caef70971-config\") pod \"authentication-operator-69f744f599-jffj8\" (UID: \"8b0b480d-ae68-4b26-b9f8-6b3caef70971\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jffj8" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.933438 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3587f3ba-577b-425a-adf5-336a8977dcc5-audit-dir\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.933469 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.933534 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/d582dc3e-7510-42be-aa3a-1d15b35c327c-stats-auth\") pod \"router-default-5444994796-5gdgj\" (UID: \"d582dc3e-7510-42be-aa3a-1d15b35c327c\") " pod="openshift-ingress/router-default-5444994796-5gdgj" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.933591 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rdtv\" (UniqueName: 
\"kubernetes.io/projected/8b0b480d-ae68-4b26-b9f8-6b3caef70971-kube-api-access-7rdtv\") pod \"authentication-operator-69f744f599-jffj8\" (UID: \"8b0b480d-ae68-4b26-b9f8-6b3caef70971\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jffj8" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.933618 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a8f2eaf6-3749-4695-8df1-5972598c8ac6-trusted-ca-bundle\") pod \"apiserver-76f77b778f-vpp2k\" (UID: \"a8f2eaf6-3749-4695-8df1-5972598c8ac6\") " pod="openshift-apiserver/apiserver-76f77b778f-vpp2k" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.933658 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e8ca532e-b0d7-494c-886f-bff0c8009707-console-config\") pod \"console-f9d7485db-sqrz5\" (UID: \"e8ca532e-b0d7-494c-886f-bff0c8009707\") " pod="openshift-console/console-f9d7485db-sqrz5" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.933680 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d4ce1856-395a-4003-9642-61da7cbdd789-images\") pod \"machine-api-operator-5694c8668f-s52jd\" (UID: \"d4ce1856-395a-4003-9642-61da7cbdd789\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-s52jd" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.933752 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d09bb09c-7ad0-4971-b6a2-1b37bff617b5-bound-sa-token\") pod \"ingress-operator-5b745b69d9-rtgqn\" (UID: \"d09bb09c-7ad0-4971-b6a2-1b37bff617b5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rtgqn" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.933777 4860 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-bcrkt\" (UniqueName: \"kubernetes.io/projected/d582dc3e-7510-42be-aa3a-1d15b35c327c-kube-api-access-bcrkt\") pod \"router-default-5444994796-5gdgj\" (UID: \"d582dc3e-7510-42be-aa3a-1d15b35c327c\") " pod="openshift-ingress/router-default-5444994796-5gdgj" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.933862 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b0b480d-ae68-4b26-b9f8-6b3caef70971-serving-cert\") pod \"authentication-operator-69f744f599-jffj8\" (UID: \"8b0b480d-ae68-4b26-b9f8-6b3caef70971\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jffj8" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.933930 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8ca532e-b0d7-494c-886f-bff0c8009707-trusted-ca-bundle\") pod \"console-f9d7485db-sqrz5\" (UID: \"e8ca532e-b0d7-494c-886f-bff0c8009707\") " pod="openshift-console/console-f9d7485db-sqrz5" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.934006 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/2f2e3391-68a2-43a8-aba9-17e583066b03-machine-approver-tls\") pod \"machine-approver-56656f9798-hz7zj\" (UID: \"2f2e3391-68a2-43a8-aba9-17e583066b03\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hz7zj" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.934032 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6r94\" (UniqueName: \"kubernetes.io/projected/2f2e3391-68a2-43a8-aba9-17e583066b03-kube-api-access-r6r94\") pod \"machine-approver-56656f9798-hz7zj\" (UID: \"2f2e3391-68a2-43a8-aba9-17e583066b03\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hz7zj" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.934092 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.934098 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/a8f2eaf6-3749-4695-8df1-5972598c8ac6-image-import-ca\") pod \"apiserver-76f77b778f-vpp2k\" (UID: \"a8f2eaf6-3749-4695-8df1-5972598c8ac6\") " pod="openshift-apiserver/apiserver-76f77b778f-vpp2k" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.934119 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f15060fa-5a28-4a12-be7b-2823e921eb90-serving-cert\") pod \"route-controller-manager-6576b87f9c-nxq82\" (UID: \"f15060fa-5a28-4a12-be7b-2823e921eb90\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nxq82" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.934183 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/d4ce1856-395a-4003-9642-61da7cbdd789-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-s52jd\" (UID: \"d4ce1856-395a-4003-9642-61da7cbdd789\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-s52jd" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.934257 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: 
\"kubernetes.io/configmap/a8f2eaf6-3749-4695-8df1-5972598c8ac6-audit\") pod \"apiserver-76f77b778f-vpp2k\" (UID: \"a8f2eaf6-3749-4695-8df1-5972598c8ac6\") " pod="openshift-apiserver/apiserver-76f77b778f-vpp2k" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.934290 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ee7ff56-a3fb-465a-9ab5-8cba6bbfcd0e-config\") pod \"console-operator-58897d9998-pc5tf\" (UID: \"7ee7ff56-a3fb-465a-9ab5-8cba6bbfcd0e\") " pod="openshift-console-operator/console-operator-58897d9998-pc5tf" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.934354 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d88n2\" (UniqueName: \"kubernetes.io/projected/f15060fa-5a28-4a12-be7b-2823e921eb90-kube-api-access-d88n2\") pod \"route-controller-manager-6576b87f9c-nxq82\" (UID: \"f15060fa-5a28-4a12-be7b-2823e921eb90\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nxq82" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.934386 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2wm2\" (UniqueName: \"kubernetes.io/projected/b7c6fefc-e60e-423d-ad15-2e16173ae01b-kube-api-access-j2wm2\") pod \"multus-admission-controller-857f4d67dd-vk2rn\" (UID: \"b7c6fefc-e60e-423d-ad15-2e16173ae01b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vk2rn" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.934417 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcc7l\" (UniqueName: \"kubernetes.io/projected/d09bb09c-7ad0-4971-b6a2-1b37bff617b5-kube-api-access-lcc7l\") pod \"ingress-operator-5b745b69d9-rtgqn\" (UID: \"d09bb09c-7ad0-4971-b6a2-1b37bff617b5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rtgqn" Mar 20 10:58:19 crc 
kubenswrapper[4860]: I0320 10:58:19.934452 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.934480 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9krs\" (UniqueName: \"kubernetes.io/projected/0eff7ea5-251b-44de-b129-c604349d6e6c-kube-api-access-c9krs\") pod \"apiserver-7bbb656c7d-twkfs\" (UID: \"0eff7ea5-251b-44de-b129-c604349d6e6c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-twkfs" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.934512 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d09bb09c-7ad0-4971-b6a2-1b37bff617b5-trusted-ca\") pod \"ingress-operator-5b745b69d9-rtgqn\" (UID: \"d09bb09c-7ad0-4971-b6a2-1b37bff617b5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rtgqn" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.934554 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.934583 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ee7ff56-a3fb-465a-9ab5-8cba6bbfcd0e-serving-cert\") pod \"console-operator-58897d9998-pc5tf\" (UID: 
\"7ee7ff56-a3fb-465a-9ab5-8cba6bbfcd0e\") " pod="openshift-console-operator/console-operator-58897d9998-pc5tf" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.934609 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67f7l\" (UniqueName: \"kubernetes.io/projected/b2a9fedf-d226-4388-8432-b22efd3b74bb-kube-api-access-67f7l\") pod \"service-ca-9c57cc56f-9ktqw\" (UID: \"b2a9fedf-d226-4388-8432-b22efd3b74bb\") " pod="openshift-service-ca/service-ca-9c57cc56f-9ktqw" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.934653 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a8f2eaf6-3749-4695-8df1-5972598c8ac6-audit-dir\") pod \"apiserver-76f77b778f-vpp2k\" (UID: \"a8f2eaf6-3749-4695-8df1-5972598c8ac6\") " pod="openshift-apiserver/apiserver-76f77b778f-vpp2k" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.934679 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d582dc3e-7510-42be-aa3a-1d15b35c327c-service-ca-bundle\") pod \"router-default-5444994796-5gdgj\" (UID: \"d582dc3e-7510-42be-aa3a-1d15b35c327c\") " pod="openshift-ingress/router-default-5444994796-5gdgj" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.934711 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/24a452b3-94a8-4c29-8409-cb1a8dd11555-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-pr8t8\" (UID: \"24a452b3-94a8-4c29-8409-cb1a8dd11555\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pr8t8" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.934746 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/f1a219c6-74dc-4511-867e-cf2fce301cad-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wv5fd\" (UID: \"f1a219c6-74dc-4511-867e-cf2fce301cad\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wv5fd" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.934998 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s682l\" (UniqueName: \"kubernetes.io/projected/1414be44-7a88-4f16-9653-51a5793bd729-kube-api-access-s682l\") pod \"cluster-samples-operator-665b6dd947-qgkd4\" (UID: \"1414be44-7a88-4f16-9653-51a5793bd729\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qgkd4" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.935023 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1414be44-7a88-4f16-9653-51a5793bd729-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-qgkd4\" (UID: \"1414be44-7a88-4f16-9653-51a5793bd729\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qgkd4" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.935092 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/b2a9fedf-d226-4388-8432-b22efd3b74bb-signing-key\") pod \"service-ca-9c57cc56f-9ktqw\" (UID: \"b2a9fedf-d226-4388-8432-b22efd3b74bb\") " pod="openshift-service-ca/service-ca-9c57cc56f-9ktqw" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.935155 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f2e3391-68a2-43a8-aba9-17e583066b03-config\") pod \"machine-approver-56656f9798-hz7zj\" (UID: \"2f2e3391-68a2-43a8-aba9-17e583066b03\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hz7zj" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.935183 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2f2e3391-68a2-43a8-aba9-17e583066b03-auth-proxy-config\") pod \"machine-approver-56656f9798-hz7zj\" (UID: \"2f2e3391-68a2-43a8-aba9-17e583066b03\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hz7zj" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.935252 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f1a219c6-74dc-4511-867e-cf2fce301cad-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wv5fd\" (UID: \"f1a219c6-74dc-4511-867e-cf2fce301cad\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wv5fd" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.935309 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f15060fa-5a28-4a12-be7b-2823e921eb90-client-ca\") pod \"route-controller-manager-6576b87f9c-nxq82\" (UID: \"f15060fa-5a28-4a12-be7b-2823e921eb90\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nxq82" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.935336 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d582dc3e-7510-42be-aa3a-1d15b35c327c-metrics-certs\") pod \"router-default-5444994796-5gdgj\" (UID: \"d582dc3e-7510-42be-aa3a-1d15b35c327c\") " pod="openshift-ingress/router-default-5444994796-5gdgj" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.935392 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.935428 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c9dab77c-3c60-4c91-8c0a-31791124462d-metrics-tls\") pod \"dns-operator-744455d44c-6xl8q\" (UID: \"c9dab77c-3c60-4c91-8c0a-31791124462d\") " pod="openshift-dns-operator/dns-operator-744455d44c-6xl8q" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.935495 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/efda2c60-f018-417a-a73d-2727be57b558-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-6stjq\" (UID: \"efda2c60-f018-417a-a73d-2727be57b558\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6stjq" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.935555 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1a219c6-74dc-4511-867e-cf2fce301cad-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wv5fd\" (UID: \"f1a219c6-74dc-4511-867e-cf2fce301cad\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wv5fd" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.935553 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a8f2eaf6-3749-4695-8df1-5972598c8ac6-audit-dir\") pod \"apiserver-76f77b778f-vpp2k\" (UID: \"a8f2eaf6-3749-4695-8df1-5972598c8ac6\") " pod="openshift-apiserver/apiserver-76f77b778f-vpp2k" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 
10:58:19.935592 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0eff7ea5-251b-44de-b129-c604349d6e6c-audit-policies\") pod \"apiserver-7bbb656c7d-twkfs\" (UID: \"0eff7ea5-251b-44de-b129-c604349d6e6c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-twkfs" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.935665 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsxn6\" (UniqueName: \"kubernetes.io/projected/24a452b3-94a8-4c29-8409-cb1a8dd11555-kube-api-access-rsxn6\") pod \"machine-config-controller-84d6567774-pr8t8\" (UID: \"24a452b3-94a8-4c29-8409-cb1a8dd11555\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pr8t8" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.935701 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rh6md\" (UniqueName: \"kubernetes.io/projected/37972326-b1df-484f-ab10-9c595b145d8c-kube-api-access-rh6md\") pod \"migrator-59844c95c7-lcdbx\" (UID: \"37972326-b1df-484f-ab10-9c595b145d8c\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lcdbx" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.935730 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b7c6fefc-e60e-423d-ad15-2e16173ae01b-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-vk2rn\" (UID: \"b7c6fefc-e60e-423d-ad15-2e16173ae01b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vk2rn" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.935757 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9nzq\" (UniqueName: \"kubernetes.io/projected/8fe93f79-239c-4b6a-bd22-bbdf55aff0af-kube-api-access-s9nzq\") pod 
\"catalog-operator-68c6474976-gf5nr\" (UID: \"8fe93f79-239c-4b6a-bd22-bbdf55aff0af\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gf5nr" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.935789 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/8fe93f79-239c-4b6a-bd22-bbdf55aff0af-profile-collector-cert\") pod \"catalog-operator-68c6474976-gf5nr\" (UID: \"8fe93f79-239c-4b6a-bd22-bbdf55aff0af\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gf5nr" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.935819 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkfzm\" (UniqueName: \"kubernetes.io/projected/a8f2eaf6-3749-4695-8df1-5972598c8ac6-kube-api-access-tkfzm\") pod \"apiserver-76f77b778f-vpp2k\" (UID: \"a8f2eaf6-3749-4695-8df1-5972598c8ac6\") " pod="openshift-apiserver/apiserver-76f77b778f-vpp2k" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.935848 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.935909 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lj2w\" (UniqueName: \"kubernetes.io/projected/f67156a9-f474-4d80-9789-ffbfcc9ec78b-kube-api-access-9lj2w\") pod \"openshift-config-operator-7777fb866f-xmkqm\" (UID: \"f67156a9-f474-4d80-9789-ffbfcc9ec78b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xmkqm" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 
10:58:19.935938 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0eff7ea5-251b-44de-b129-c604349d6e6c-serving-cert\") pod \"apiserver-7bbb656c7d-twkfs\" (UID: \"0eff7ea5-251b-44de-b129-c604349d6e6c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-twkfs" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.935967 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8wdj\" (UniqueName: \"kubernetes.io/projected/7ee7ff56-a3fb-465a-9ab5-8cba6bbfcd0e-kube-api-access-v8wdj\") pod \"console-operator-58897d9998-pc5tf\" (UID: \"7ee7ff56-a3fb-465a-9ab5-8cba6bbfcd0e\") " pod="openshift-console-operator/console-operator-58897d9998-pc5tf" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.935975 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.935992 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/0eff7ea5-251b-44de-b129-c604349d6e6c-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-twkfs\" (UID: \"0eff7ea5-251b-44de-b129-c604349d6e6c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-twkfs" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.936054 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d62cba0d-d390-4638-aa42-59631e4bf118-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-4429p\" (UID: \"d62cba0d-d390-4638-aa42-59631e4bf118\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4429p" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.936094 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.936157 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e8ca532e-b0d7-494c-886f-bff0c8009707-console-oauth-config\") pod \"console-f9d7485db-sqrz5\" (UID: \"e8ca532e-b0d7-494c-886f-bff0c8009707\") " pod="openshift-console/console-f9d7485db-sqrz5" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.936242 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0eff7ea5-251b-44de-b129-c604349d6e6c-etcd-client\") pod \"apiserver-7bbb656c7d-twkfs\" (UID: \"0eff7ea5-251b-44de-b129-c604349d6e6c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-twkfs" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.936340 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7ee7ff56-a3fb-465a-9ab5-8cba6bbfcd0e-trusted-ca\") pod \"console-operator-58897d9998-pc5tf\" (UID: \"7ee7ff56-a3fb-465a-9ab5-8cba6bbfcd0e\") " pod="openshift-console-operator/console-operator-58897d9998-pc5tf" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.936405 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqqmp\" (UniqueName: \"kubernetes.io/projected/d4ce1856-395a-4003-9642-61da7cbdd789-kube-api-access-zqqmp\") 
pod \"machine-api-operator-5694c8668f-s52jd\" (UID: \"d4ce1856-395a-4003-9642-61da7cbdd789\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-s52jd" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.936467 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8f2eaf6-3749-4695-8df1-5972598c8ac6-config\") pod \"apiserver-76f77b778f-vpp2k\" (UID: \"a8f2eaf6-3749-4695-8df1-5972598c8ac6\") " pod="openshift-apiserver/apiserver-76f77b778f-vpp2k" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.936500 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a8f2eaf6-3749-4695-8df1-5972598c8ac6-encryption-config\") pod \"apiserver-76f77b778f-vpp2k\" (UID: \"a8f2eaf6-3749-4695-8df1-5972598c8ac6\") " pod="openshift-apiserver/apiserver-76f77b778f-vpp2k" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.936575 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/24a452b3-94a8-4c29-8409-cb1a8dd11555-proxy-tls\") pod \"machine-config-controller-84d6567774-pr8t8\" (UID: \"24a452b3-94a8-4c29-8409-cb1a8dd11555\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pr8t8" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.936645 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/d582dc3e-7510-42be-aa3a-1d15b35c327c-default-certificate\") pod \"router-default-5444994796-5gdgj\" (UID: \"d582dc3e-7510-42be-aa3a-1d15b35c327c\") " pod="openshift-ingress/router-default-5444994796-5gdgj" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.936689 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/f15060fa-5a28-4a12-be7b-2823e921eb90-config\") pod \"route-controller-manager-6576b87f9c-nxq82\" (UID: \"f15060fa-5a28-4a12-be7b-2823e921eb90\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nxq82" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.936758 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ckhx\" (UniqueName: \"kubernetes.io/projected/825c6b77-c03a-463c-b9a4-d26a1ac398f0-kube-api-access-2ckhx\") pod \"downloads-7954f5f757-45vfv\" (UID: \"825c6b77-c03a-463c-b9a4-d26a1ac398f0\") " pod="openshift-console/downloads-7954f5f757-45vfv" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.936820 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d09bb09c-7ad0-4971-b6a2-1b37bff617b5-metrics-tls\") pod \"ingress-operator-5b745b69d9-rtgqn\" (UID: \"d09bb09c-7ad0-4971-b6a2-1b37bff617b5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rtgqn" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.936892 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0eff7ea5-251b-44de-b129-c604349d6e6c-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-twkfs\" (UID: \"0eff7ea5-251b-44de-b129-c604349d6e6c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-twkfs" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.936909 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:19 crc 
kubenswrapper[4860]: I0320 10:58:19.936933 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgncr\" (UniqueName: \"kubernetes.io/projected/efda2c60-f018-417a-a73d-2727be57b558-kube-api-access-qgncr\") pod \"kube-storage-version-migrator-operator-b67b599dd-6stjq\" (UID: \"efda2c60-f018-417a-a73d-2727be57b558\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6stjq" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.936998 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.937056 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lk8bz\" (UniqueName: \"kubernetes.io/projected/e8ca532e-b0d7-494c-886f-bff0c8009707-kube-api-access-lk8bz\") pod \"console-f9d7485db-sqrz5\" (UID: \"e8ca532e-b0d7-494c-886f-bff0c8009707\") " pod="openshift-console/console-f9d7485db-sqrz5" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.937071 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d4ce1856-395a-4003-9642-61da7cbdd789-images\") pod \"machine-api-operator-5694c8668f-s52jd\" (UID: \"d4ce1856-395a-4003-9642-61da7cbdd789\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-s52jd" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.937104 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: 
\"kubernetes.io/empty-dir/f67156a9-f474-4d80-9789-ffbfcc9ec78b-available-featuregates\") pod \"openshift-config-operator-7777fb866f-xmkqm\" (UID: \"f67156a9-f474-4d80-9789-ffbfcc9ec78b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xmkqm" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.937171 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e8ca532e-b0d7-494c-886f-bff0c8009707-oauth-serving-cert\") pod \"console-f9d7485db-sqrz5\" (UID: \"e8ca532e-b0d7-494c-886f-bff0c8009707\") " pod="openshift-console/console-f9d7485db-sqrz5" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.937253 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.937326 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a8f2eaf6-3749-4695-8df1-5972598c8ac6-etcd-serving-ca\") pod \"apiserver-76f77b778f-vpp2k\" (UID: \"a8f2eaf6-3749-4695-8df1-5972598c8ac6\") " pod="openshift-apiserver/apiserver-76f77b778f-vpp2k" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.937389 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b0b480d-ae68-4b26-b9f8-6b3caef70971-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-jffj8\" (UID: \"8b0b480d-ae68-4b26-b9f8-6b3caef70971\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jffj8" Mar 20 10:58:19 crc 
kubenswrapper[4860]: I0320 10:58:19.939625 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1a219c6-74dc-4511-867e-cf2fce301cad-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wv5fd\" (UID: \"f1a219c6-74dc-4511-867e-cf2fce301cad\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wv5fd" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.939864 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8f2eaf6-3749-4695-8df1-5972598c8ac6-serving-cert\") pod \"apiserver-76f77b778f-vpp2k\" (UID: \"a8f2eaf6-3749-4695-8df1-5972598c8ac6\") " pod="openshift-apiserver/apiserver-76f77b778f-vpp2k" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.939912 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ee7ff56-a3fb-465a-9ab5-8cba6bbfcd0e-config\") pod \"console-operator-58897d9998-pc5tf\" (UID: \"7ee7ff56-a3fb-465a-9ab5-8cba6bbfcd0e\") " pod="openshift-console-operator/console-operator-58897d9998-pc5tf" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.941209 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0eff7ea5-251b-44de-b129-c604349d6e6c-audit-policies\") pod \"apiserver-7bbb656c7d-twkfs\" (UID: \"0eff7ea5-251b-44de-b129-c604349d6e6c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-twkfs" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.941595 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/d4ce1856-395a-4003-9642-61da7cbdd789-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-s52jd\" (UID: \"d4ce1856-395a-4003-9642-61da7cbdd789\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-s52jd" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.942269 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c9dab77c-3c60-4c91-8c0a-31791124462d-metrics-tls\") pod \"dns-operator-744455d44c-6xl8q\" (UID: \"c9dab77c-3c60-4c91-8c0a-31791124462d\") " pod="openshift-dns-operator/dns-operator-744455d44c-6xl8q" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.942624 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7ee7ff56-a3fb-465a-9ab5-8cba6bbfcd0e-trusted-ca\") pod \"console-operator-58897d9998-pc5tf\" (UID: \"7ee7ff56-a3fb-465a-9ab5-8cba6bbfcd0e\") " pod="openshift-console-operator/console-operator-58897d9998-pc5tf" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.942992 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/24a452b3-94a8-4c29-8409-cb1a8dd11555-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-pr8t8\" (UID: \"24a452b3-94a8-4c29-8409-cb1a8dd11555\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pr8t8" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.943203 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a8f2eaf6-3749-4695-8df1-5972598c8ac6-trusted-ca-bundle\") pod \"apiserver-76f77b778f-vpp2k\" (UID: \"a8f2eaf6-3749-4695-8df1-5972598c8ac6\") " pod="openshift-apiserver/apiserver-76f77b778f-vpp2k" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.943515 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a8f2eaf6-3749-4695-8df1-5972598c8ac6-etcd-serving-ca\") pod \"apiserver-76f77b778f-vpp2k\" (UID: 
\"a8f2eaf6-3749-4695-8df1-5972598c8ac6\") " pod="openshift-apiserver/apiserver-76f77b778f-vpp2k" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.943805 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/f67156a9-f474-4d80-9789-ffbfcc9ec78b-available-featuregates\") pod \"openshift-config-operator-7777fb866f-xmkqm\" (UID: \"f67156a9-f474-4d80-9789-ffbfcc9ec78b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xmkqm" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.944071 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f15060fa-5a28-4a12-be7b-2823e921eb90-serving-cert\") pod \"route-controller-manager-6576b87f9c-nxq82\" (UID: \"f15060fa-5a28-4a12-be7b-2823e921eb90\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nxq82" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.944148 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/a8f2eaf6-3749-4695-8df1-5972598c8ac6-audit\") pod \"apiserver-76f77b778f-vpp2k\" (UID: \"a8f2eaf6-3749-4695-8df1-5972598c8ac6\") " pod="openshift-apiserver/apiserver-76f77b778f-vpp2k" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.944182 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3587f3ba-577b-425a-adf5-336a8977dcc5-audit-dir\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.944206 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f2e3391-68a2-43a8-aba9-17e583066b03-config\") pod \"machine-approver-56656f9798-hz7zj\" 
(UID: \"2f2e3391-68a2-43a8-aba9-17e583066b03\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hz7zj" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.944313 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2f2e3391-68a2-43a8-aba9-17e583066b03-auth-proxy-config\") pod \"machine-approver-56656f9798-hz7zj\" (UID: \"2f2e3391-68a2-43a8-aba9-17e583066b03\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hz7zj" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.944330 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.944719 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.944746 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.944879 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"stats-auth\" (UniqueName: \"kubernetes.io/secret/d582dc3e-7510-42be-aa3a-1d15b35c327c-stats-auth\") pod \"router-default-5444994796-5gdgj\" (UID: \"d582dc3e-7510-42be-aa3a-1d15b35c327c\") " pod="openshift-ingress/router-default-5444994796-5gdgj" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.945163 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.945163 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.945450 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8f2eaf6-3749-4695-8df1-5972598c8ac6-config\") pod \"apiserver-76f77b778f-vpp2k\" (UID: \"a8f2eaf6-3749-4695-8df1-5972598c8ac6\") " pod="openshift-apiserver/apiserver-76f77b778f-vpp2k" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.945548 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a8f2eaf6-3749-4695-8df1-5972598c8ac6-etcd-client\") pod \"apiserver-76f77b778f-vpp2k\" (UID: \"a8f2eaf6-3749-4695-8df1-5972598c8ac6\") " pod="openshift-apiserver/apiserver-76f77b778f-vpp2k" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.945585 4860 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4ce1856-395a-4003-9642-61da7cbdd789-config\") pod \"machine-api-operator-5694c8668f-s52jd\" (UID: \"d4ce1856-395a-4003-9642-61da7cbdd789\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-s52jd" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.945614 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/b2a9fedf-d226-4388-8432-b22efd3b74bb-signing-cabundle\") pod \"service-ca-9c57cc56f-9ktqw\" (UID: \"b2a9fedf-d226-4388-8432-b22efd3b74bb\") " pod="openshift-service-ca/service-ca-9c57cc56f-9ktqw" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.945817 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/0eff7ea5-251b-44de-b129-c604349d6e6c-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-twkfs\" (UID: \"0eff7ea5-251b-44de-b129-c604349d6e6c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-twkfs" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.945907 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d62cba0d-d390-4638-aa42-59631e4bf118-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-4429p\" (UID: \"d62cba0d-d390-4638-aa42-59631e4bf118\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4429p" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.945940 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3587f3ba-577b-425a-adf5-336a8977dcc5-audit-policies\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.945972 
4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a8f2eaf6-3749-4695-8df1-5972598c8ac6-node-pullsecrets\") pod \"apiserver-76f77b778f-vpp2k\" (UID: \"a8f2eaf6-3749-4695-8df1-5972598c8ac6\") " pod="openshift-apiserver/apiserver-76f77b778f-vpp2k" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.945996 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d582dc3e-7510-42be-aa3a-1d15b35c327c-service-ca-bundle\") pod \"router-default-5444994796-5gdgj\" (UID: \"d582dc3e-7510-42be-aa3a-1d15b35c327c\") " pod="openshift-ingress/router-default-5444994796-5gdgj" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.946069 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a8f2eaf6-3749-4695-8df1-5972598c8ac6-node-pullsecrets\") pod \"apiserver-76f77b778f-vpp2k\" (UID: \"a8f2eaf6-3749-4695-8df1-5972598c8ac6\") " pod="openshift-apiserver/apiserver-76f77b778f-vpp2k" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.946324 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/efda2c60-f018-417a-a73d-2727be57b558-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-6stjq\" (UID: \"efda2c60-f018-417a-a73d-2727be57b558\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6stjq" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.946697 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f15060fa-5a28-4a12-be7b-2823e921eb90-config\") pod \"route-controller-manager-6576b87f9c-nxq82\" (UID: \"f15060fa-5a28-4a12-be7b-2823e921eb90\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nxq82" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.946832 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0eff7ea5-251b-44de-b129-c604349d6e6c-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-twkfs\" (UID: \"0eff7ea5-251b-44de-b129-c604349d6e6c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-twkfs" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.946902 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3587f3ba-577b-425a-adf5-336a8977dcc5-audit-policies\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.946918 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwq9m\" (UniqueName: \"kubernetes.io/projected/c9dab77c-3c60-4c91-8c0a-31791124462d-kube-api-access-rwq9m\") pod \"dns-operator-744455d44c-6xl8q\" (UID: \"c9dab77c-3c60-4c91-8c0a-31791124462d\") " pod="openshift-dns-operator/dns-operator-744455d44c-6xl8q" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.947104 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f15060fa-5a28-4a12-be7b-2823e921eb90-client-ca\") pod \"route-controller-manager-6576b87f9c-nxq82\" (UID: \"f15060fa-5a28-4a12-be7b-2823e921eb90\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nxq82" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.947376 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d582dc3e-7510-42be-aa3a-1d15b35c327c-metrics-certs\") pod 
\"router-default-5444994796-5gdgj\" (UID: \"d582dc3e-7510-42be-aa3a-1d15b35c327c\") " pod="openshift-ingress/router-default-5444994796-5gdgj" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.947866 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a8f2eaf6-3749-4695-8df1-5972598c8ac6-encryption-config\") pod \"apiserver-76f77b778f-vpp2k\" (UID: \"a8f2eaf6-3749-4695-8df1-5972598c8ac6\") " pod="openshift-apiserver/apiserver-76f77b778f-vpp2k" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.948390 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.948467 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0eff7ea5-251b-44de-b129-c604349d6e6c-serving-cert\") pod \"apiserver-7bbb656c7d-twkfs\" (UID: \"0eff7ea5-251b-44de-b129-c604349d6e6c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-twkfs" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.949558 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.949774 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f1a219c6-74dc-4511-867e-cf2fce301cad-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wv5fd\" (UID: \"f1a219c6-74dc-4511-867e-cf2fce301cad\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wv5fd" Mar 20 10:58:19 crc 
kubenswrapper[4860]: I0320 10:58:19.949848 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ee7ff56-a3fb-465a-9ab5-8cba6bbfcd0e-serving-cert\") pod \"console-operator-58897d9998-pc5tf\" (UID: \"7ee7ff56-a3fb-465a-9ab5-8cba6bbfcd0e\") " pod="openshift-console-operator/console-operator-58897d9998-pc5tf" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.950173 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.950522 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d62cba0d-d390-4638-aa42-59631e4bf118-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-4429p\" (UID: \"d62cba0d-d390-4638-aa42-59631e4bf118\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4429p" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.950644 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.950927 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/24a452b3-94a8-4c29-8409-cb1a8dd11555-proxy-tls\") pod \"machine-config-controller-84d6567774-pr8t8\" (UID: 
\"24a452b3-94a8-4c29-8409-cb1a8dd11555\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pr8t8" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.951100 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4ce1856-395a-4003-9642-61da7cbdd789-config\") pod \"machine-api-operator-5694c8668f-s52jd\" (UID: \"d4ce1856-395a-4003-9642-61da7cbdd789\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-s52jd" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.951311 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/0eff7ea5-251b-44de-b129-c604349d6e6c-encryption-config\") pod \"apiserver-7bbb656c7d-twkfs\" (UID: \"0eff7ea5-251b-44de-b129-c604349d6e6c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-twkfs" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.951421 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/d62cba0d-d390-4638-aa42-59631e4bf118-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-4429p\" (UID: \"d62cba0d-d390-4638-aa42-59631e4bf118\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4429p" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.951948 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a8f2eaf6-3749-4695-8df1-5972598c8ac6-etcd-client\") pod \"apiserver-76f77b778f-vpp2k\" (UID: \"a8f2eaf6-3749-4695-8df1-5972598c8ac6\") " pod="openshift-apiserver/apiserver-76f77b778f-vpp2k" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.952211 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.952298 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/d582dc3e-7510-42be-aa3a-1d15b35c327c-default-certificate\") pod \"router-default-5444994796-5gdgj\" (UID: \"d582dc3e-7510-42be-aa3a-1d15b35c327c\") " pod="openshift-ingress/router-default-5444994796-5gdgj" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.952595 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/2f2e3391-68a2-43a8-aba9-17e583066b03-machine-approver-tls\") pod \"machine-approver-56656f9798-hz7zj\" (UID: \"2f2e3391-68a2-43a8-aba9-17e583066b03\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hz7zj" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.953636 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f67156a9-f474-4d80-9789-ffbfcc9ec78b-serving-cert\") pod \"openshift-config-operator-7777fb866f-xmkqm\" (UID: \"f67156a9-f474-4d80-9789-ffbfcc9ec78b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xmkqm" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.954182 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0eff7ea5-251b-44de-b129-c604349d6e6c-etcd-client\") pod \"apiserver-7bbb656c7d-twkfs\" (UID: \"0eff7ea5-251b-44de-b129-c604349d6e6c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-twkfs" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.956624 4860 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e8ca532e-b0d7-494c-886f-bff0c8009707-console-serving-cert\") pod \"console-f9d7485db-sqrz5\" (UID: \"e8ca532e-b0d7-494c-886f-bff0c8009707\") " pod="openshift-console/console-f9d7485db-sqrz5" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.968041 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.981039 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e8ca532e-b0d7-494c-886f-bff0c8009707-console-oauth-config\") pod \"console-f9d7485db-sqrz5\" (UID: \"e8ca532e-b0d7-494c-886f-bff0c8009707\") " pod="openshift-console/console-f9d7485db-sqrz5" Mar 20 10:58:19 crc kubenswrapper[4860]: I0320 10:58:19.989105 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.008077 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.036982 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.047692 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/b2a9fedf-d226-4388-8432-b22efd3b74bb-signing-cabundle\") pod \"service-ca-9c57cc56f-9ktqw\" (UID: \"b2a9fedf-d226-4388-8432-b22efd3b74bb\") " pod="openshift-service-ca/service-ca-9c57cc56f-9ktqw" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.047735 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/efda2c60-f018-417a-a73d-2727be57b558-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-6stjq\" (UID: \"efda2c60-f018-417a-a73d-2727be57b558\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6stjq" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.047784 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b0b480d-ae68-4b26-b9f8-6b3caef70971-service-ca-bundle\") pod \"authentication-operator-69f744f599-jffj8\" (UID: \"8b0b480d-ae68-4b26-b9f8-6b3caef70971\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jffj8" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.048321 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8fe93f79-239c-4b6a-bd22-bbdf55aff0af-srv-cert\") pod \"catalog-operator-68c6474976-gf5nr\" (UID: \"8fe93f79-239c-4b6a-bd22-bbdf55aff0af\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gf5nr" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.048369 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b0b480d-ae68-4b26-b9f8-6b3caef70971-config\") pod \"authentication-operator-69f744f599-jffj8\" (UID: \"8b0b480d-ae68-4b26-b9f8-6b3caef70971\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jffj8" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.048404 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rdtv\" (UniqueName: \"kubernetes.io/projected/8b0b480d-ae68-4b26-b9f8-6b3caef70971-kube-api-access-7rdtv\") pod \"authentication-operator-69f744f599-jffj8\" (UID: \"8b0b480d-ae68-4b26-b9f8-6b3caef70971\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-jffj8" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.048441 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d09bb09c-7ad0-4971-b6a2-1b37bff617b5-bound-sa-token\") pod \"ingress-operator-5b745b69d9-rtgqn\" (UID: \"d09bb09c-7ad0-4971-b6a2-1b37bff617b5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rtgqn" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.048497 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.048511 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b0b480d-ae68-4b26-b9f8-6b3caef70971-serving-cert\") pod \"authentication-operator-69f744f599-jffj8\" (UID: \"8b0b480d-ae68-4b26-b9f8-6b3caef70971\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jffj8" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.049286 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2wm2\" (UniqueName: \"kubernetes.io/projected/b7c6fefc-e60e-423d-ad15-2e16173ae01b-kube-api-access-j2wm2\") pod \"multus-admission-controller-857f4d67dd-vk2rn\" (UID: \"b7c6fefc-e60e-423d-ad15-2e16173ae01b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vk2rn" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.049428 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8ca532e-b0d7-494c-886f-bff0c8009707-trusted-ca-bundle\") pod \"console-f9d7485db-sqrz5\" (UID: \"e8ca532e-b0d7-494c-886f-bff0c8009707\") " pod="openshift-console/console-f9d7485db-sqrz5" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.049326 4860 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcc7l\" (UniqueName: \"kubernetes.io/projected/d09bb09c-7ad0-4971-b6a2-1b37bff617b5-kube-api-access-lcc7l\") pod \"ingress-operator-5b745b69d9-rtgqn\" (UID: \"d09bb09c-7ad0-4971-b6a2-1b37bff617b5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rtgqn" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.049556 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d09bb09c-7ad0-4971-b6a2-1b37bff617b5-trusted-ca\") pod \"ingress-operator-5b745b69d9-rtgqn\" (UID: \"d09bb09c-7ad0-4971-b6a2-1b37bff617b5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rtgqn" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.049611 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67f7l\" (UniqueName: \"kubernetes.io/projected/b2a9fedf-d226-4388-8432-b22efd3b74bb-kube-api-access-67f7l\") pod \"service-ca-9c57cc56f-9ktqw\" (UID: \"b2a9fedf-d226-4388-8432-b22efd3b74bb\") " pod="openshift-service-ca/service-ca-9c57cc56f-9ktqw" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.049669 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s682l\" (UniqueName: \"kubernetes.io/projected/1414be44-7a88-4f16-9653-51a5793bd729-kube-api-access-s682l\") pod \"cluster-samples-operator-665b6dd947-qgkd4\" (UID: \"1414be44-7a88-4f16-9653-51a5793bd729\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qgkd4" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.049705 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1414be44-7a88-4f16-9653-51a5793bd729-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-qgkd4\" (UID: \"1414be44-7a88-4f16-9653-51a5793bd729\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qgkd4" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.049984 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/b2a9fedf-d226-4388-8432-b22efd3b74bb-signing-key\") pod \"service-ca-9c57cc56f-9ktqw\" (UID: \"b2a9fedf-d226-4388-8432-b22efd3b74bb\") " pod="openshift-service-ca/service-ca-9c57cc56f-9ktqw" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.050078 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/efda2c60-f018-417a-a73d-2727be57b558-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-6stjq\" (UID: \"efda2c60-f018-417a-a73d-2727be57b558\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6stjq" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.050160 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b7c6fefc-e60e-423d-ad15-2e16173ae01b-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-vk2rn\" (UID: \"b7c6fefc-e60e-423d-ad15-2e16173ae01b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vk2rn" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.050197 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9nzq\" (UniqueName: \"kubernetes.io/projected/8fe93f79-239c-4b6a-bd22-bbdf55aff0af-kube-api-access-s9nzq\") pod \"catalog-operator-68c6474976-gf5nr\" (UID: \"8fe93f79-239c-4b6a-bd22-bbdf55aff0af\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gf5nr" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.050534 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/8fe93f79-239c-4b6a-bd22-bbdf55aff0af-profile-collector-cert\") pod \"catalog-operator-68c6474976-gf5nr\" (UID: \"8fe93f79-239c-4b6a-bd22-bbdf55aff0af\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gf5nr" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.050792 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ckhx\" (UniqueName: \"kubernetes.io/projected/825c6b77-c03a-463c-b9a4-d26a1ac398f0-kube-api-access-2ckhx\") pod \"downloads-7954f5f757-45vfv\" (UID: \"825c6b77-c03a-463c-b9a4-d26a1ac398f0\") " pod="openshift-console/downloads-7954f5f757-45vfv" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.050849 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d09bb09c-7ad0-4971-b6a2-1b37bff617b5-metrics-tls\") pod \"ingress-operator-5b745b69d9-rtgqn\" (UID: \"d09bb09c-7ad0-4971-b6a2-1b37bff617b5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rtgqn" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.051162 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgncr\" (UniqueName: \"kubernetes.io/projected/efda2c60-f018-417a-a73d-2727be57b558-kube-api-access-qgncr\") pod \"kube-storage-version-migrator-operator-b67b599dd-6stjq\" (UID: \"efda2c60-f018-417a-a73d-2727be57b558\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6stjq" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.051274 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b0b480d-ae68-4b26-b9f8-6b3caef70971-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-jffj8\" (UID: \"8b0b480d-ae68-4b26-b9f8-6b3caef70971\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-jffj8" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.058439 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e8ca532e-b0d7-494c-886f-bff0c8009707-oauth-serving-cert\") pod \"console-f9d7485db-sqrz5\" (UID: \"e8ca532e-b0d7-494c-886f-bff0c8009707\") " pod="openshift-console/console-f9d7485db-sqrz5" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.068121 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.073797 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e8ca532e-b0d7-494c-886f-bff0c8009707-service-ca\") pod \"console-f9d7485db-sqrz5\" (UID: \"e8ca532e-b0d7-494c-886f-bff0c8009707\") " pod="openshift-console/console-f9d7485db-sqrz5" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.088383 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.090146 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e8ca532e-b0d7-494c-886f-bff0c8009707-console-config\") pod \"console-f9d7485db-sqrz5\" (UID: \"e8ca532e-b0d7-494c-886f-bff0c8009707\") " pod="openshift-console/console-f9d7485db-sqrz5" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.108219 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.129025 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 20 10:58:20 crc 
kubenswrapper[4860]: I0320 10:58:20.148506 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.168805 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.187942 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.215929 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.222163 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d09bb09c-7ad0-4971-b6a2-1b37bff617b5-trusted-ca\") pod \"ingress-operator-5b745b69d9-rtgqn\" (UID: \"d09bb09c-7ad0-4971-b6a2-1b37bff617b5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rtgqn" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.228791 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.236289 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d09bb09c-7ad0-4971-b6a2-1b37bff617b5-metrics-tls\") pod \"ingress-operator-5b745b69d9-rtgqn\" (UID: \"d09bb09c-7ad0-4971-b6a2-1b37bff617b5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rtgqn" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.248988 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 
10:58:20.268357 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.288188 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.307309 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.328423 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.348297 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.368403 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.389882 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.409806 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.429683 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.448806 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.468925 4860 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.489047 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.504981 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1414be44-7a88-4f16-9653-51a5793bd729-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-qgkd4\" (UID: \"1414be44-7a88-4f16-9653-51a5793bd729\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qgkd4" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.508445 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.529660 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.549559 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.568725 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.590150 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.608871 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.628817 4860 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.648028 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.668184 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.688678 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.695755 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b7c6fefc-e60e-423d-ad15-2e16173ae01b-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-vk2rn\" (UID: \"b7c6fefc-e60e-423d-ad15-2e16173ae01b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vk2rn" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.708454 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.710638 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/efda2c60-f018-417a-a73d-2727be57b558-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-6stjq\" (UID: \"efda2c60-f018-417a-a73d-2727be57b558\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6stjq" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.729508 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 20 10:58:20 crc 
kubenswrapper[4860]: I0320 10:58:20.735821 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/efda2c60-f018-417a-a73d-2727be57b558-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-6stjq\" (UID: \"efda2c60-f018-417a-a73d-2727be57b558\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6stjq" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.749527 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.768535 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.789153 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.808136 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.826026 4860 request.go:700] Waited for 1.009372347s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&limit=500&resourceVersion=0 Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.827320 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.848934 4860 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.869203 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.888752 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.909035 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.929132 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.949156 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.968861 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.988486 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 20 10:58:20 crc kubenswrapper[4860]: I0320 10:58:20.989630 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/b2a9fedf-d226-4388-8432-b22efd3b74bb-signing-cabundle\") pod \"service-ca-9c57cc56f-9ktqw\" (UID: \"b2a9fedf-d226-4388-8432-b22efd3b74bb\") " pod="openshift-service-ca/service-ca-9c57cc56f-9ktqw" Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.008520 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 
10:58:21.014155 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/b2a9fedf-d226-4388-8432-b22efd3b74bb-signing-key\") pod \"service-ca-9c57cc56f-9ktqw\" (UID: \"b2a9fedf-d226-4388-8432-b22efd3b74bb\") " pod="openshift-service-ca/service-ca-9c57cc56f-9ktqw" Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.029261 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.047870 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 20 10:58:21 crc kubenswrapper[4860]: E0320 10:58:21.048969 4860 secret.go:188] Couldn't get secret openshift-authentication-operator/serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 20 10:58:21 crc kubenswrapper[4860]: E0320 10:58:21.049137 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b0b480d-ae68-4b26-b9f8-6b3caef70971-serving-cert podName:8b0b480d-ae68-4b26-b9f8-6b3caef70971 nodeName:}" failed. No retries permitted until 2026-03-20 10:58:21.549113985 +0000 UTC m=+225.770474893 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/8b0b480d-ae68-4b26-b9f8-6b3caef70971-serving-cert") pod "authentication-operator-69f744f599-jffj8" (UID: "8b0b480d-ae68-4b26-b9f8-6b3caef70971") : failed to sync secret cache: timed out waiting for the condition Mar 20 10:58:21 crc kubenswrapper[4860]: E0320 10:58:21.048968 4860 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 20 10:58:21 crc kubenswrapper[4860]: E0320 10:58:21.049422 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8fe93f79-239c-4b6a-bd22-bbdf55aff0af-srv-cert podName:8fe93f79-239c-4b6a-bd22-bbdf55aff0af nodeName:}" failed. No retries permitted until 2026-03-20 10:58:21.549408973 +0000 UTC m=+225.770769881 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/8fe93f79-239c-4b6a-bd22-bbdf55aff0af-srv-cert") pod "catalog-operator-68c6474976-gf5nr" (UID: "8fe93f79-239c-4b6a-bd22-bbdf55aff0af") : failed to sync secret cache: timed out waiting for the condition Mar 20 10:58:21 crc kubenswrapper[4860]: E0320 10:58:21.049032 4860 configmap.go:193] Couldn't get configMap openshift-authentication-operator/service-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Mar 20 10:58:21 crc kubenswrapper[4860]: E0320 10:58:21.049604 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8b0b480d-ae68-4b26-b9f8-6b3caef70971-service-ca-bundle podName:8b0b480d-ae68-4b26-b9f8-6b3caef70971 nodeName:}" failed. No retries permitted until 2026-03-20 10:58:21.549593418 +0000 UTC m=+225.770954326 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/8b0b480d-ae68-4b26-b9f8-6b3caef70971-service-ca-bundle") pod "authentication-operator-69f744f599-jffj8" (UID: "8b0b480d-ae68-4b26-b9f8-6b3caef70971") : failed to sync configmap cache: timed out waiting for the condition Mar 20 10:58:21 crc kubenswrapper[4860]: E0320 10:58:21.049301 4860 configmap.go:193] Couldn't get configMap openshift-authentication-operator/authentication-operator-config: failed to sync configmap cache: timed out waiting for the condition Mar 20 10:58:21 crc kubenswrapper[4860]: E0320 10:58:21.049824 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8b0b480d-ae68-4b26-b9f8-6b3caef70971-config podName:8b0b480d-ae68-4b26-b9f8-6b3caef70971 nodeName:}" failed. No retries permitted until 2026-03-20 10:58:21.549808634 +0000 UTC m=+225.771169542 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/8b0b480d-ae68-4b26-b9f8-6b3caef70971-config") pod "authentication-operator-69f744f599-jffj8" (UID: "8b0b480d-ae68-4b26-b9f8-6b3caef70971") : failed to sync configmap cache: timed out waiting for the condition Mar 20 10:58:21 crc kubenswrapper[4860]: E0320 10:58:21.052504 4860 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/pprof-cert: failed to sync secret cache: timed out waiting for the condition Mar 20 10:58:21 crc kubenswrapper[4860]: E0320 10:58:21.052571 4860 configmap.go:193] Couldn't get configMap openshift-authentication-operator/trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Mar 20 10:58:21 crc kubenswrapper[4860]: E0320 10:58:21.052739 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8fe93f79-239c-4b6a-bd22-bbdf55aff0af-profile-collector-cert podName:8fe93f79-239c-4b6a-bd22-bbdf55aff0af nodeName:}" failed. 
No retries permitted until 2026-03-20 10:58:21.552723995 +0000 UTC m=+225.774084903 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "profile-collector-cert" (UniqueName: "kubernetes.io/secret/8fe93f79-239c-4b6a-bd22-bbdf55aff0af-profile-collector-cert") pod "catalog-operator-68c6474976-gf5nr" (UID: "8fe93f79-239c-4b6a-bd22-bbdf55aff0af") : failed to sync secret cache: timed out waiting for the condition Mar 20 10:58:21 crc kubenswrapper[4860]: E0320 10:58:21.052871 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8b0b480d-ae68-4b26-b9f8-6b3caef70971-trusted-ca-bundle podName:8b0b480d-ae68-4b26-b9f8-6b3caef70971 nodeName:}" failed. No retries permitted until 2026-03-20 10:58:21.552853109 +0000 UTC m=+225.774214017 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/8b0b480d-ae68-4b26-b9f8-6b3caef70971-trusted-ca-bundle") pod "authentication-operator-69f744f599-jffj8" (UID: "8b0b480d-ae68-4b26-b9f8-6b3caef70971") : failed to sync configmap cache: timed out waiting for the condition Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.068170 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.088479 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.108948 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.129288 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.149938 4860 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.169704 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.217220 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7g2j\" (UniqueName: \"kubernetes.io/projected/6ef8eec8-b86d-4f5a-931e-c76e11c07f94-kube-api-access-x7g2j\") pod \"controller-manager-879f6c89f-7xnrh\" (UID: \"6ef8eec8-b86d-4f5a-931e-c76e11c07f94\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7xnrh" Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.229929 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.233948 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqjlf\" (UniqueName: \"kubernetes.io/projected/cfa87f04-40c6-4575-b647-fb13a115b81d-kube-api-access-bqjlf\") pod \"openshift-controller-manager-operator-756b6f6bc6-5pq7k\" (UID: \"cfa87f04-40c6-4575-b647-fb13a115b81d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5pq7k" Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.280672 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6c5c\" (UniqueName: \"kubernetes.io/projected/a58a54d3-d454-4503-8b70-0e78784efdfc-kube-api-access-v6c5c\") pod \"etcd-operator-b45778765-hfxcc\" (UID: \"a58a54d3-d454-4503-8b70-0e78784efdfc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hfxcc" Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.283848 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcm82\" (UniqueName: 
\"kubernetes.io/projected/ba450d5b-c962-4788-8215-d1eb12f9b314-kube-api-access-vcm82\") pod \"openshift-apiserver-operator-796bbdcf4f-76g8f\" (UID: \"ba450d5b-c962-4788-8215-d1eb12f9b314\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-76g8f" Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.288167 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.323549 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.330593 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.348321 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.368518 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.388202 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.408936 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.428765 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.448583 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 20 10:58:21 crc 
kubenswrapper[4860]: I0320 10:58:21.458657 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-76g8f" Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.475567 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-hfxcc" Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.480521 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.483018 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-7xnrh" Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.488257 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.509559 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.526460 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5pq7k" Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.551246 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.570113 4860 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.581507 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/8fe93f79-239c-4b6a-bd22-bbdf55aff0af-profile-collector-cert\") pod \"catalog-operator-68c6474976-gf5nr\" (UID: \"8fe93f79-239c-4b6a-bd22-bbdf55aff0af\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gf5nr" Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.581631 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b0b480d-ae68-4b26-b9f8-6b3caef70971-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-jffj8\" (UID: \"8b0b480d-ae68-4b26-b9f8-6b3caef70971\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jffj8" Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.581688 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b0b480d-ae68-4b26-b9f8-6b3caef70971-service-ca-bundle\") pod \"authentication-operator-69f744f599-jffj8\" (UID: \"8b0b480d-ae68-4b26-b9f8-6b3caef70971\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jffj8" Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.581720 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" 
(UniqueName: \"kubernetes.io/secret/8fe93f79-239c-4b6a-bd22-bbdf55aff0af-srv-cert\") pod \"catalog-operator-68c6474976-gf5nr\" (UID: \"8fe93f79-239c-4b6a-bd22-bbdf55aff0af\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gf5nr" Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.581737 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b0b480d-ae68-4b26-b9f8-6b3caef70971-config\") pod \"authentication-operator-69f744f599-jffj8\" (UID: \"8b0b480d-ae68-4b26-b9f8-6b3caef70971\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jffj8" Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.581771 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b0b480d-ae68-4b26-b9f8-6b3caef70971-serving-cert\") pod \"authentication-operator-69f744f599-jffj8\" (UID: \"8b0b480d-ae68-4b26-b9f8-6b3caef70971\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jffj8" Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.583654 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b0b480d-ae68-4b26-b9f8-6b3caef70971-service-ca-bundle\") pod \"authentication-operator-69f744f599-jffj8\" (UID: \"8b0b480d-ae68-4b26-b9f8-6b3caef70971\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jffj8" Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.584108 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b0b480d-ae68-4b26-b9f8-6b3caef70971-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-jffj8\" (UID: \"8b0b480d-ae68-4b26-b9f8-6b3caef70971\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jffj8" Mar 20 10:58:21 crc 
kubenswrapper[4860]: I0320 10:58:21.584196 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b0b480d-ae68-4b26-b9f8-6b3caef70971-config\") pod \"authentication-operator-69f744f599-jffj8\" (UID: \"8b0b480d-ae68-4b26-b9f8-6b3caef70971\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jffj8" Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.586892 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b0b480d-ae68-4b26-b9f8-6b3caef70971-serving-cert\") pod \"authentication-operator-69f744f599-jffj8\" (UID: \"8b0b480d-ae68-4b26-b9f8-6b3caef70971\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jffj8" Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.586983 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8fe93f79-239c-4b6a-bd22-bbdf55aff0af-srv-cert\") pod \"catalog-operator-68c6474976-gf5nr\" (UID: \"8fe93f79-239c-4b6a-bd22-bbdf55aff0af\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gf5nr" Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.587469 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.589192 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/8fe93f79-239c-4b6a-bd22-bbdf55aff0af-profile-collector-cert\") pod \"catalog-operator-68c6474976-gf5nr\" (UID: \"8fe93f79-239c-4b6a-bd22-bbdf55aff0af\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gf5nr" Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.628890 4860 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.648193 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.669697 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.688761 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.692177 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-76g8f"] Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.708308 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.726878 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-hfxcc"] Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.735387 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.750171 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.769265 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.775253 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5pq7k"] Mar 20 10:58:21 crc kubenswrapper[4860]: W0320 10:58:21.787061 4860 manager.go:1169] Failed to 
process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfa87f04_40c6_4575_b647_fb13a115b81d.slice/crio-5b41337622fad4321f7bb86fe4803fd1cf32c952050c583a599de54f99215a27 WatchSource:0}: Error finding container 5b41337622fad4321f7bb86fe4803fd1cf32c952050c583a599de54f99215a27: Status 404 returned error can't find the container with id 5b41337622fad4321f7bb86fe4803fd1cf32c952050c583a599de54f99215a27 Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.787845 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.809258 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.827014 4860 request.go:700] Waited for 1.894313224s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/serviceaccounts/cluster-image-registry-operator/token Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.850037 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d989g\" (UniqueName: \"kubernetes.io/projected/d62cba0d-d390-4638-aa42-59631e4bf118-kube-api-access-d989g\") pod \"cluster-image-registry-operator-dc59b4c8b-4429p\" (UID: \"d62cba0d-d390-4638-aa42-59631e4bf118\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4429p" Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.870520 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqnwc\" (UniqueName: \"kubernetes.io/projected/3587f3ba-577b-425a-adf5-336a8977dcc5-kube-api-access-rqnwc\") pod \"oauth-openshift-558db77b4-srz5x\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.887294 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcrkt\" (UniqueName: \"kubernetes.io/projected/d582dc3e-7510-42be-aa3a-1d15b35c327c-kube-api-access-bcrkt\") pod \"router-default-5444994796-5gdgj\" (UID: \"d582dc3e-7510-42be-aa3a-1d15b35c327c\") " pod="openshift-ingress/router-default-5444994796-5gdgj" Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.904262 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d88n2\" (UniqueName: \"kubernetes.io/projected/f15060fa-5a28-4a12-be7b-2823e921eb90-kube-api-access-d88n2\") pod \"route-controller-manager-6576b87f9c-nxq82\" (UID: \"f15060fa-5a28-4a12-be7b-2823e921eb90\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nxq82" Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.923422 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9krs\" (UniqueName: \"kubernetes.io/projected/0eff7ea5-251b-44de-b129-c604349d6e6c-kube-api-access-c9krs\") pod \"apiserver-7bbb656c7d-twkfs\" (UID: \"0eff7ea5-251b-44de-b129-c604349d6e6c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-twkfs" Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.927078 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-twkfs" Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.942816 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6r94\" (UniqueName: \"kubernetes.io/projected/2f2e3391-68a2-43a8-aba9-17e583066b03-kube-api-access-r6r94\") pod \"machine-approver-56656f9798-hz7zj\" (UID: \"2f2e3391-68a2-43a8-aba9-17e583066b03\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hz7zj" Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.965377 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7xnrh"] Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.967456 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsxn6\" (UniqueName: \"kubernetes.io/projected/24a452b3-94a8-4c29-8409-cb1a8dd11555-kube-api-access-rsxn6\") pod \"machine-config-controller-84d6567774-pr8t8\" (UID: \"24a452b3-94a8-4c29-8409-cb1a8dd11555\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pr8t8" Mar 20 10:58:21 crc kubenswrapper[4860]: W0320 10:58:21.978281 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ef8eec8_b86d_4f5a_931e_c76e11c07f94.slice/crio-2980d18f8ba28b033dbf5d36b160c7efad5ae1c6664cb1dba27569a3bcc37db2 WatchSource:0}: Error finding container 2980d18f8ba28b033dbf5d36b160c7efad5ae1c6664cb1dba27569a3bcc37db2: Status 404 returned error can't find the container with id 2980d18f8ba28b033dbf5d36b160c7efad5ae1c6664cb1dba27569a3bcc37db2 Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.989019 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rh6md\" (UniqueName: \"kubernetes.io/projected/37972326-b1df-484f-ab10-9c595b145d8c-kube-api-access-rh6md\") pod \"migrator-59844c95c7-lcdbx\" (UID: 
\"37972326-b1df-484f-ab10-9c595b145d8c\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lcdbx" Mar 20 10:58:21 crc kubenswrapper[4860]: I0320 10:58:21.996747 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nxq82" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.004187 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lcdbx" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.004440 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkfzm\" (UniqueName: \"kubernetes.io/projected/a8f2eaf6-3749-4695-8df1-5972598c8ac6-kube-api-access-tkfzm\") pod \"apiserver-76f77b778f-vpp2k\" (UID: \"a8f2eaf6-3749-4695-8df1-5972598c8ac6\") " pod="openshift-apiserver/apiserver-76f77b778f-vpp2k" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.019992 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-5gdgj" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.023951 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8wdj\" (UniqueName: \"kubernetes.io/projected/7ee7ff56-a3fb-465a-9ab5-8cba6bbfcd0e-kube-api-access-v8wdj\") pod \"console-operator-58897d9998-pc5tf\" (UID: \"7ee7ff56-a3fb-465a-9ab5-8cba6bbfcd0e\") " pod="openshift-console-operator/console-operator-58897d9998-pc5tf" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.032939 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pr8t8" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.050364 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lj2w\" (UniqueName: \"kubernetes.io/projected/f67156a9-f474-4d80-9789-ffbfcc9ec78b-kube-api-access-9lj2w\") pod \"openshift-config-operator-7777fb866f-xmkqm\" (UID: \"f67156a9-f474-4d80-9789-ffbfcc9ec78b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xmkqm" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.066288 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f1a219c6-74dc-4511-867e-cf2fce301cad-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wv5fd\" (UID: \"f1a219c6-74dc-4511-867e-cf2fce301cad\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wv5fd" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.089763 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d62cba0d-d390-4638-aa42-59631e4bf118-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-4429p\" (UID: \"d62cba0d-d390-4638-aa42-59631e4bf118\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4429p" Mar 20 10:58:22 crc kubenswrapper[4860]: W0320 10:58:22.095947 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd582dc3e_7510_42be_aa3a_1d15b35c327c.slice/crio-4007d87bec9e8cbbb2f96fba62cbd83fc23bd3a2fceb0d216c6cc21d5756432d WatchSource:0}: Error finding container 4007d87bec9e8cbbb2f96fba62cbd83fc23bd3a2fceb0d216c6cc21d5756432d: Status 404 returned error can't find the container with id 4007d87bec9e8cbbb2f96fba62cbd83fc23bd3a2fceb0d216c6cc21d5756432d Mar 20 10:58:22 crc 
kubenswrapper[4860]: I0320 10:58:22.112907 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lk8bz\" (UniqueName: \"kubernetes.io/projected/e8ca532e-b0d7-494c-886f-bff0c8009707-kube-api-access-lk8bz\") pod \"console-f9d7485db-sqrz5\" (UID: \"e8ca532e-b0d7-494c-886f-bff0c8009707\") " pod="openshift-console/console-f9d7485db-sqrz5" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.126742 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-twkfs"] Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.133665 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqqmp\" (UniqueName: \"kubernetes.io/projected/d4ce1856-395a-4003-9642-61da7cbdd789-kube-api-access-zqqmp\") pod \"machine-api-operator-5694c8668f-s52jd\" (UID: \"d4ce1856-395a-4003-9642-61da7cbdd789\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-s52jd" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.141484 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.150983 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwq9m\" (UniqueName: \"kubernetes.io/projected/c9dab77c-3c60-4c91-8c0a-31791124462d-kube-api-access-rwq9m\") pod \"dns-operator-744455d44c-6xl8q\" (UID: \"c9dab77c-3c60-4c91-8c0a-31791124462d\") " pod="openshift-dns-operator/dns-operator-744455d44c-6xl8q" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.169909 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rdtv\" (UniqueName: \"kubernetes.io/projected/8b0b480d-ae68-4b26-b9f8-6b3caef70971-kube-api-access-7rdtv\") pod \"authentication-operator-69f744f599-jffj8\" (UID: \"8b0b480d-ae68-4b26-b9f8-6b3caef70971\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jffj8" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.174816 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xmkqm" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.186780 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d09bb09c-7ad0-4971-b6a2-1b37bff617b5-bound-sa-token\") pod \"ingress-operator-5b745b69d9-rtgqn\" (UID: \"d09bb09c-7ad0-4971-b6a2-1b37bff617b5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rtgqn" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.194012 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-6xl8q" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.200772 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hz7zj" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.217542 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2wm2\" (UniqueName: \"kubernetes.io/projected/b7c6fefc-e60e-423d-ad15-2e16173ae01b-kube-api-access-j2wm2\") pod \"multus-admission-controller-857f4d67dd-vk2rn\" (UID: \"b7c6fefc-e60e-423d-ad15-2e16173ae01b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vk2rn" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.218520 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-pc5tf" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.240600 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-jffj8" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.256464 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcc7l\" (UniqueName: \"kubernetes.io/projected/d09bb09c-7ad0-4971-b6a2-1b37bff617b5-kube-api-access-lcc7l\") pod \"ingress-operator-5b745b69d9-rtgqn\" (UID: \"d09bb09c-7ad0-4971-b6a2-1b37bff617b5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rtgqn" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.259554 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-nxq82"] Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.267021 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67f7l\" (UniqueName: \"kubernetes.io/projected/b2a9fedf-d226-4388-8432-b22efd3b74bb-kube-api-access-67f7l\") pod \"service-ca-9c57cc56f-9ktqw\" (UID: \"b2a9fedf-d226-4388-8432-b22efd3b74bb\") " pod="openshift-service-ca/service-ca-9c57cc56f-9ktqw" Mar 20 
10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.267603 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s682l\" (UniqueName: \"kubernetes.io/projected/1414be44-7a88-4f16-9653-51a5793bd729-kube-api-access-s682l\") pod \"cluster-samples-operator-665b6dd947-qgkd4\" (UID: \"1414be44-7a88-4f16-9653-51a5793bd729\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qgkd4" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.273605 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4429p" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.282856 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-vpp2k" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.290983 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-s52jd" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.314334 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wv5fd" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.317603 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9nzq\" (UniqueName: \"kubernetes.io/projected/8fe93f79-239c-4b6a-bd22-bbdf55aff0af-kube-api-access-s9nzq\") pod \"catalog-operator-68c6474976-gf5nr\" (UID: \"8fe93f79-239c-4b6a-bd22-bbdf55aff0af\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gf5nr" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.322415 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ckhx\" (UniqueName: \"kubernetes.io/projected/825c6b77-c03a-463c-b9a4-d26a1ac398f0-kube-api-access-2ckhx\") pod \"downloads-7954f5f757-45vfv\" (UID: \"825c6b77-c03a-463c-b9a4-d26a1ac398f0\") " pod="openshift-console/downloads-7954f5f757-45vfv" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.346947 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgncr\" (UniqueName: \"kubernetes.io/projected/efda2c60-f018-417a-a73d-2727be57b558-kube-api-access-qgncr\") pod \"kube-storage-version-migrator-operator-b67b599dd-6stjq\" (UID: \"efda2c60-f018-417a-a73d-2727be57b558\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6stjq" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.348570 4860 patch_prober.go:28] interesting pod/machine-config-daemon-kvdqp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.348644 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" 
podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.352031 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-lcdbx"] Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.360929 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-sqrz5" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.375017 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rtgqn" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.387571 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-45vfv" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.395615 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/34043403-110c-4547-81a4-7af1429878cd-tmpfs\") pod \"packageserver-d55dfcdfc-vfkd4\" (UID: \"34043403-110c-4547-81a4-7af1429878cd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vfkd4" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.395673 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3c8d3f73-b9b3-48cc-b6a0-f882d45bc9d3-images\") pod \"machine-config-operator-74547568cd-2glmz\" (UID: \"3c8d3f73-b9b3-48cc-b6a0-f882d45bc9d3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2glmz" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.395699 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/f171fccd-40ef-44a6-941f-ef1f2f4d2c2a-config\") pod \"kube-controller-manager-operator-78b949d7b-v6ff7\" (UID: \"f171fccd-40ef-44a6-941f-ef1f2f4d2c2a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v6ff7" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.395770 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.395804 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnfgq\" (UniqueName: \"kubernetes.io/projected/3c8d3f73-b9b3-48cc-b6a0-f882d45bc9d3-kube-api-access-xnfgq\") pod \"machine-config-operator-74547568cd-2glmz\" (UID: \"3c8d3f73-b9b3-48cc-b6a0-f882d45bc9d3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2glmz" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.395823 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/437c32d4-4b5f-4657-86d6-5214e3bfc01f-secret-volume\") pod \"collect-profiles-29566725-d6wf9\" (UID: \"437c32d4-4b5f-4657-86d6-5214e3bfc01f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-d6wf9" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.395845 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/8bc351c5-b724-443e-a7e2-f4abba352cef-control-plane-machine-set-operator-tls\") pod 
\"control-plane-machine-set-operator-78cbb6b69f-jnqsw\" (UID: \"8bc351c5-b724-443e-a7e2-f4abba352cef\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jnqsw" Mar 20 10:58:22 crc kubenswrapper[4860]: E0320 10:58:22.396466 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:22.896445783 +0000 UTC m=+227.117806681 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.395915 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gvm7\" (UniqueName: \"kubernetes.io/projected/628f2025-d050-42a9-bf56-9daa0e5c001b-kube-api-access-6gvm7\") pod \"package-server-manager-789f6589d5-mcqdw\" (UID: \"628f2025-d050-42a9-bf56-9daa0e5c001b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mcqdw" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.397742 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/39b41087-226b-4f73-9fc4-64616b430f2d-ca-trust-extracted\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.397763 4860 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n92tj\" (UniqueName: \"kubernetes.io/projected/39b41087-226b-4f73-9fc4-64616b430f2d-kube-api-access-n92tj\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.397808 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/34043403-110c-4547-81a4-7af1429878cd-webhook-cert\") pod \"packageserver-d55dfcdfc-vfkd4\" (UID: \"34043403-110c-4547-81a4-7af1429878cd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vfkd4" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.398081 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/628f2025-d050-42a9-bf56-9daa0e5c001b-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-mcqdw\" (UID: \"628f2025-d050-42a9-bf56-9daa0e5c001b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mcqdw" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.398111 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/39b41087-226b-4f73-9fc4-64616b430f2d-registry-certificates\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.398130 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/39b41087-226b-4f73-9fc4-64616b430f2d-installation-pull-secrets\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.398150 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpk7d\" (UniqueName: \"kubernetes.io/projected/437c32d4-4b5f-4657-86d6-5214e3bfc01f-kube-api-access-wpk7d\") pod \"collect-profiles-29566725-d6wf9\" (UID: \"437c32d4-4b5f-4657-86d6-5214e3bfc01f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-d6wf9" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.398194 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8d5eff6-150c-4314-8ebc-38b3660ce01a-config\") pod \"kube-apiserver-operator-766d6c64bb-jvcqp\" (UID: \"e8d5eff6-150c-4314-8ebc-38b3660ce01a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jvcqp" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.398962 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8d5eff6-150c-4314-8ebc-38b3660ce01a-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-jvcqp\" (UID: \"e8d5eff6-150c-4314-8ebc-38b3660ce01a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jvcqp" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.399002 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/39b41087-226b-4f73-9fc4-64616b430f2d-trusted-ca\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.399019 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3c8d3f73-b9b3-48cc-b6a0-f882d45bc9d3-proxy-tls\") pod \"machine-config-operator-74547568cd-2glmz\" (UID: \"3c8d3f73-b9b3-48cc-b6a0-f882d45bc9d3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2glmz" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.399037 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7mtp\" (UniqueName: \"kubernetes.io/projected/9d98ac55-cf65-4f72-805b-dd3da2742004-kube-api-access-p7mtp\") pod \"olm-operator-6b444d44fb-9ffz6\" (UID: \"9d98ac55-cf65-4f72-805b-dd3da2742004\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9ffz6" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.399054 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/403ca5f6-bd52-40de-88d6-5151b3202c76-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zhgh4\" (UID: \"403ca5f6-bd52-40de-88d6-5151b3202c76\") " pod="openshift-marketplace/marketplace-operator-79b997595-zhgh4" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.399110 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/39b41087-226b-4f73-9fc4-64616b430f2d-registry-tls\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.399137 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/403ca5f6-bd52-40de-88d6-5151b3202c76-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zhgh4\" (UID: \"403ca5f6-bd52-40de-88d6-5151b3202c76\") " pod="openshift-marketplace/marketplace-operator-79b997595-zhgh4" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.399205 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-248wx\" (UniqueName: \"kubernetes.io/projected/34043403-110c-4547-81a4-7af1429878cd-kube-api-access-248wx\") pod \"packageserver-d55dfcdfc-vfkd4\" (UID: \"34043403-110c-4547-81a4-7af1429878cd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vfkd4" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.400374 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9d98ac55-cf65-4f72-805b-dd3da2742004-profile-collector-cert\") pod \"olm-operator-6b444d44fb-9ffz6\" (UID: \"9d98ac55-cf65-4f72-805b-dd3da2742004\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9ffz6" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.400443 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5043c51-bd3f-461f-b011-a42ad38ed7d4-serving-cert\") pod \"service-ca-operator-777779d784-x4x44\" (UID: \"f5043c51-bd3f-461f-b011-a42ad38ed7d4\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-x4x44" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.400467 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3c8d3f73-b9b3-48cc-b6a0-f882d45bc9d3-auth-proxy-config\") pod \"machine-config-operator-74547568cd-2glmz\" (UID: 
\"3c8d3f73-b9b3-48cc-b6a0-f882d45bc9d3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2glmz" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.400753 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqvlf\" (UniqueName: \"kubernetes.io/projected/403ca5f6-bd52-40de-88d6-5151b3202c76-kube-api-access-hqvlf\") pod \"marketplace-operator-79b997595-zhgh4\" (UID: \"403ca5f6-bd52-40de-88d6-5151b3202c76\") " pod="openshift-marketplace/marketplace-operator-79b997595-zhgh4" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.400800 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5043c51-bd3f-461f-b011-a42ad38ed7d4-config\") pod \"service-ca-operator-777779d784-x4x44\" (UID: \"f5043c51-bd3f-461f-b011-a42ad38ed7d4\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-x4x44" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.400822 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e8d5eff6-150c-4314-8ebc-38b3660ce01a-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-jvcqp\" (UID: \"e8d5eff6-150c-4314-8ebc-38b3660ce01a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jvcqp" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.400899 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/39b41087-226b-4f73-9fc4-64616b430f2d-bound-sa-token\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.401438 4860 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fplbn\" (UniqueName: \"kubernetes.io/projected/f5043c51-bd3f-461f-b011-a42ad38ed7d4-kube-api-access-fplbn\") pod \"service-ca-operator-777779d784-x4x44\" (UID: \"f5043c51-bd3f-461f-b011-a42ad38ed7d4\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-x4x44" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.401476 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/437c32d4-4b5f-4657-86d6-5214e3bfc01f-config-volume\") pod \"collect-profiles-29566725-d6wf9\" (UID: \"437c32d4-4b5f-4657-86d6-5214e3bfc01f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-d6wf9" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.401502 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9d98ac55-cf65-4f72-805b-dd3da2742004-srv-cert\") pod \"olm-operator-6b444d44fb-9ffz6\" (UID: \"9d98ac55-cf65-4f72-805b-dd3da2742004\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9ffz6" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.401537 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9q9t\" (UniqueName: \"kubernetes.io/projected/8bc351c5-b724-443e-a7e2-f4abba352cef-kube-api-access-p9q9t\") pod \"control-plane-machine-set-operator-78cbb6b69f-jnqsw\" (UID: \"8bc351c5-b724-443e-a7e2-f4abba352cef\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jnqsw" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.401559 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f171fccd-40ef-44a6-941f-ef1f2f4d2c2a-kube-api-access\") pod 
\"kube-controller-manager-operator-78b949d7b-v6ff7\" (UID: \"f171fccd-40ef-44a6-941f-ef1f2f4d2c2a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v6ff7" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.401648 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hjpq\" (UniqueName: \"kubernetes.io/projected/ba2ab33e-6ecc-4eac-9aaa-256e6ff68236-kube-api-access-5hjpq\") pod \"auto-csr-approver-29566738-5cj22\" (UID: \"ba2ab33e-6ecc-4eac-9aaa-256e6ff68236\") " pod="openshift-infra/auto-csr-approver-29566738-5cj22" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.401806 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f171fccd-40ef-44a6-941f-ef1f2f4d2c2a-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-v6ff7\" (UID: \"f171fccd-40ef-44a6-941f-ef1f2f4d2c2a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v6ff7" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.401851 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/34043403-110c-4547-81a4-7af1429878cd-apiservice-cert\") pod \"packageserver-d55dfcdfc-vfkd4\" (UID: \"34043403-110c-4547-81a4-7af1429878cd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vfkd4" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.414045 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qgkd4" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.422562 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-pr8t8"] Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.427674 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6stjq" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.437100 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-vk2rn" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.481634 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gf5nr" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.484078 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-srz5x"] Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.502785 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.503829 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8d5eff6-150c-4314-8ebc-38b3660ce01a-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-jvcqp\" (UID: \"e8d5eff6-150c-4314-8ebc-38b3660ce01a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jvcqp" Mar 20 
10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.503885 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/39b41087-226b-4f73-9fc4-64616b430f2d-trusted-ca\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.503904 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3c8d3f73-b9b3-48cc-b6a0-f882d45bc9d3-proxy-tls\") pod \"machine-config-operator-74547568cd-2glmz\" (UID: \"3c8d3f73-b9b3-48cc-b6a0-f882d45bc9d3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2glmz" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.503936 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/403ca5f6-bd52-40de-88d6-5151b3202c76-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zhgh4\" (UID: \"403ca5f6-bd52-40de-88d6-5151b3202c76\") " pod="openshift-marketplace/marketplace-operator-79b997595-zhgh4" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.503961 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/37b4c0fc-6a82-4f6b-85fc-233090358f9c-node-bootstrap-token\") pod \"machine-config-server-x4xrf\" (UID: \"37b4c0fc-6a82-4f6b-85fc-233090358f9c\") " pod="openshift-machine-config-operator/machine-config-server-x4xrf" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.503980 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7mtp\" (UniqueName: \"kubernetes.io/projected/9d98ac55-cf65-4f72-805b-dd3da2742004-kube-api-access-p7mtp\") pod \"olm-operator-6b444d44fb-9ffz6\" 
(UID: \"9d98ac55-cf65-4f72-805b-dd3da2742004\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9ffz6" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.504014 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmjsg\" (UniqueName: \"kubernetes.io/projected/1241cd05-23d3-4e5a-9130-29e7638003a9-kube-api-access-gmjsg\") pod \"ingress-canary-wt65f\" (UID: \"1241cd05-23d3-4e5a-9130-29e7638003a9\") " pod="openshift-ingress-canary/ingress-canary-wt65f" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.504033 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwl24\" (UniqueName: \"kubernetes.io/projected/10eeabbc-0cee-4be7-a0aa-c0fe0c570e0e-kube-api-access-zwl24\") pod \"dns-default-l2xf5\" (UID: \"10eeabbc-0cee-4be7-a0aa-c0fe0c570e0e\") " pod="openshift-dns/dns-default-l2xf5" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.504063 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/39b41087-226b-4f73-9fc4-64616b430f2d-registry-tls\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.504080 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/403ca5f6-bd52-40de-88d6-5151b3202c76-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zhgh4\" (UID: \"403ca5f6-bd52-40de-88d6-5151b3202c76\") " pod="openshift-marketplace/marketplace-operator-79b997595-zhgh4" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.504162 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-248wx\" (UniqueName: 
\"kubernetes.io/projected/34043403-110c-4547-81a4-7af1429878cd-kube-api-access-248wx\") pod \"packageserver-d55dfcdfc-vfkd4\" (UID: \"34043403-110c-4547-81a4-7af1429878cd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vfkd4" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.504200 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9d98ac55-cf65-4f72-805b-dd3da2742004-profile-collector-cert\") pod \"olm-operator-6b444d44fb-9ffz6\" (UID: \"9d98ac55-cf65-4f72-805b-dd3da2742004\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9ffz6" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.504279 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5043c51-bd3f-461f-b011-a42ad38ed7d4-serving-cert\") pod \"service-ca-operator-777779d784-x4x44\" (UID: \"f5043c51-bd3f-461f-b011-a42ad38ed7d4\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-x4x44" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.504303 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3c8d3f73-b9b3-48cc-b6a0-f882d45bc9d3-auth-proxy-config\") pod \"machine-config-operator-74547568cd-2glmz\" (UID: \"3c8d3f73-b9b3-48cc-b6a0-f882d45bc9d3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2glmz" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.504324 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqvlf\" (UniqueName: \"kubernetes.io/projected/403ca5f6-bd52-40de-88d6-5151b3202c76-kube-api-access-hqvlf\") pod \"marketplace-operator-79b997595-zhgh4\" (UID: \"403ca5f6-bd52-40de-88d6-5151b3202c76\") " pod="openshift-marketplace/marketplace-operator-79b997595-zhgh4" Mar 20 10:58:22 crc 
kubenswrapper[4860]: I0320 10:58:22.504357 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5043c51-bd3f-461f-b011-a42ad38ed7d4-config\") pod \"service-ca-operator-777779d784-x4x44\" (UID: \"f5043c51-bd3f-461f-b011-a42ad38ed7d4\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-x4x44" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.504384 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e8d5eff6-150c-4314-8ebc-38b3660ce01a-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-jvcqp\" (UID: \"e8d5eff6-150c-4314-8ebc-38b3660ce01a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jvcqp" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.504418 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/39b41087-226b-4f73-9fc4-64616b430f2d-bound-sa-token\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.504438 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fplbn\" (UniqueName: \"kubernetes.io/projected/f5043c51-bd3f-461f-b011-a42ad38ed7d4-kube-api-access-fplbn\") pod \"service-ca-operator-777779d784-x4x44\" (UID: \"f5043c51-bd3f-461f-b011-a42ad38ed7d4\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-x4x44" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.504495 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/437c32d4-4b5f-4657-86d6-5214e3bfc01f-config-volume\") pod \"collect-profiles-29566725-d6wf9\" (UID: 
\"437c32d4-4b5f-4657-86d6-5214e3bfc01f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-d6wf9" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.504538 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383-plugins-dir\") pod \"csi-hostpathplugin-2k58g\" (UID: \"dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383\") " pod="hostpath-provisioner/csi-hostpathplugin-2k58g" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.504556 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/10eeabbc-0cee-4be7-a0aa-c0fe0c570e0e-config-volume\") pod \"dns-default-l2xf5\" (UID: \"10eeabbc-0cee-4be7-a0aa-c0fe0c570e0e\") " pod="openshift-dns/dns-default-l2xf5" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.504574 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9d98ac55-cf65-4f72-805b-dd3da2742004-srv-cert\") pod \"olm-operator-6b444d44fb-9ffz6\" (UID: \"9d98ac55-cf65-4f72-805b-dd3da2742004\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9ffz6" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.504609 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f171fccd-40ef-44a6-941f-ef1f2f4d2c2a-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-v6ff7\" (UID: \"f171fccd-40ef-44a6-941f-ef1f2f4d2c2a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v6ff7" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.504632 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: 
\"kubernetes.io/host-path/dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383-mountpoint-dir\") pod \"csi-hostpathplugin-2k58g\" (UID: \"dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383\") " pod="hostpath-provisioner/csi-hostpathplugin-2k58g" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.504651 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9q9t\" (UniqueName: \"kubernetes.io/projected/8bc351c5-b724-443e-a7e2-f4abba352cef-kube-api-access-p9q9t\") pod \"control-plane-machine-set-operator-78cbb6b69f-jnqsw\" (UID: \"8bc351c5-b724-443e-a7e2-f4abba352cef\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jnqsw" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.504682 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hjpq\" (UniqueName: \"kubernetes.io/projected/ba2ab33e-6ecc-4eac-9aaa-256e6ff68236-kube-api-access-5hjpq\") pod \"auto-csr-approver-29566738-5cj22\" (UID: \"ba2ab33e-6ecc-4eac-9aaa-256e6ff68236\") " pod="openshift-infra/auto-csr-approver-29566738-5cj22" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.504699 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383-csi-data-dir\") pod \"csi-hostpathplugin-2k58g\" (UID: \"dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383\") " pod="hostpath-provisioner/csi-hostpathplugin-2k58g" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.504730 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f171fccd-40ef-44a6-941f-ef1f2f4d2c2a-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-v6ff7\" (UID: \"f171fccd-40ef-44a6-941f-ef1f2f4d2c2a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v6ff7" Mar 20 10:58:22 crc kubenswrapper[4860]: 
I0320 10:58:22.504769 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/34043403-110c-4547-81a4-7af1429878cd-apiservice-cert\") pod \"packageserver-d55dfcdfc-vfkd4\" (UID: \"34043403-110c-4547-81a4-7af1429878cd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vfkd4" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.504820 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383-registration-dir\") pod \"csi-hostpathplugin-2k58g\" (UID: \"dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383\") " pod="hostpath-provisioner/csi-hostpathplugin-2k58g" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.504862 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/34043403-110c-4547-81a4-7af1429878cd-tmpfs\") pod \"packageserver-d55dfcdfc-vfkd4\" (UID: \"34043403-110c-4547-81a4-7af1429878cd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vfkd4" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.504903 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3c8d3f73-b9b3-48cc-b6a0-f882d45bc9d3-images\") pod \"machine-config-operator-74547568cd-2glmz\" (UID: \"3c8d3f73-b9b3-48cc-b6a0-f882d45bc9d3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2glmz" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.504917 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f171fccd-40ef-44a6-941f-ef1f2f4d2c2a-config\") pod \"kube-controller-manager-operator-78b949d7b-v6ff7\" (UID: \"f171fccd-40ef-44a6-941f-ef1f2f4d2c2a\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v6ff7" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.504933 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/10eeabbc-0cee-4be7-a0aa-c0fe0c570e0e-metrics-tls\") pod \"dns-default-l2xf5\" (UID: \"10eeabbc-0cee-4be7-a0aa-c0fe0c570e0e\") " pod="openshift-dns/dns-default-l2xf5" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.504962 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/37b4c0fc-6a82-4f6b-85fc-233090358f9c-certs\") pod \"machine-config-server-x4xrf\" (UID: \"37b4c0fc-6a82-4f6b-85fc-233090358f9c\") " pod="openshift-machine-config-operator/machine-config-server-x4xrf" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.504980 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383-socket-dir\") pod \"csi-hostpathplugin-2k58g\" (UID: \"dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383\") " pod="hostpath-provisioner/csi-hostpathplugin-2k58g" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.505036 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnfgq\" (UniqueName: \"kubernetes.io/projected/3c8d3f73-b9b3-48cc-b6a0-f882d45bc9d3-kube-api-access-xnfgq\") pod \"machine-config-operator-74547568cd-2glmz\" (UID: \"3c8d3f73-b9b3-48cc-b6a0-f882d45bc9d3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2glmz" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.505056 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/437c32d4-4b5f-4657-86d6-5214e3bfc01f-secret-volume\") pod 
\"collect-profiles-29566725-d6wf9\" (UID: \"437c32d4-4b5f-4657-86d6-5214e3bfc01f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-d6wf9" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.505077 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/8bc351c5-b724-443e-a7e2-f4abba352cef-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-jnqsw\" (UID: \"8bc351c5-b724-443e-a7e2-f4abba352cef\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jnqsw" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.505108 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1241cd05-23d3-4e5a-9130-29e7638003a9-cert\") pod \"ingress-canary-wt65f\" (UID: \"1241cd05-23d3-4e5a-9130-29e7638003a9\") " pod="openshift-ingress-canary/ingress-canary-wt65f" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.505145 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gvm7\" (UniqueName: \"kubernetes.io/projected/628f2025-d050-42a9-bf56-9daa0e5c001b-kube-api-access-6gvm7\") pod \"package-server-manager-789f6589d5-mcqdw\" (UID: \"628f2025-d050-42a9-bf56-9daa0e5c001b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mcqdw" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.505187 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zh75p\" (UniqueName: \"kubernetes.io/projected/dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383-kube-api-access-zh75p\") pod \"csi-hostpathplugin-2k58g\" (UID: \"dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383\") " pod="hostpath-provisioner/csi-hostpathplugin-2k58g" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.505334 4860 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/39b41087-226b-4f73-9fc4-64616b430f2d-ca-trust-extracted\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.505360 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n92tj\" (UniqueName: \"kubernetes.io/projected/39b41087-226b-4f73-9fc4-64616b430f2d-kube-api-access-n92tj\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.505383 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/34043403-110c-4547-81a4-7af1429878cd-webhook-cert\") pod \"packageserver-d55dfcdfc-vfkd4\" (UID: \"34043403-110c-4547-81a4-7af1429878cd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vfkd4" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.505402 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/628f2025-d050-42a9-bf56-9daa0e5c001b-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-mcqdw\" (UID: \"628f2025-d050-42a9-bf56-9daa0e5c001b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mcqdw" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.505476 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/39b41087-226b-4f73-9fc4-64616b430f2d-registry-certificates\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: 
\"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.505497 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/39b41087-226b-4f73-9fc4-64616b430f2d-installation-pull-secrets\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.505536 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpk7d\" (UniqueName: \"kubernetes.io/projected/437c32d4-4b5f-4657-86d6-5214e3bfc01f-kube-api-access-wpk7d\") pod \"collect-profiles-29566725-d6wf9\" (UID: \"437c32d4-4b5f-4657-86d6-5214e3bfc01f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-d6wf9" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.505555 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtll7\" (UniqueName: \"kubernetes.io/projected/37b4c0fc-6a82-4f6b-85fc-233090358f9c-kube-api-access-qtll7\") pod \"machine-config-server-x4xrf\" (UID: \"37b4c0fc-6a82-4f6b-85fc-233090358f9c\") " pod="openshift-machine-config-operator/machine-config-server-x4xrf" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.505579 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8d5eff6-150c-4314-8ebc-38b3660ce01a-config\") pod \"kube-apiserver-operator-766d6c64bb-jvcqp\" (UID: \"e8d5eff6-150c-4314-8ebc-38b3660ce01a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jvcqp" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.506200 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-config-operator/openshift-config-operator-7777fb866f-xmkqm"] Mar 20 10:58:22 crc kubenswrapper[4860]: E0320 10:58:22.506539 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:23.006512743 +0000 UTC m=+227.227873641 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.513938 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3c8d3f73-b9b3-48cc-b6a0-f882d45bc9d3-images\") pod \"machine-config-operator-74547568cd-2glmz\" (UID: \"3c8d3f73-b9b3-48cc-b6a0-f882d45bc9d3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2glmz" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.514514 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f171fccd-40ef-44a6-941f-ef1f2f4d2c2a-config\") pod \"kube-controller-manager-operator-78b949d7b-v6ff7\" (UID: \"f171fccd-40ef-44a6-941f-ef1f2f4d2c2a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v6ff7" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.516170 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/3c8d3f73-b9b3-48cc-b6a0-f882d45bc9d3-auth-proxy-config\") pod \"machine-config-operator-74547568cd-2glmz\" (UID: \"3c8d3f73-b9b3-48cc-b6a0-f882d45bc9d3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2glmz" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.516201 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8d5eff6-150c-4314-8ebc-38b3660ce01a-config\") pod \"kube-apiserver-operator-766d6c64bb-jvcqp\" (UID: \"e8d5eff6-150c-4314-8ebc-38b3660ce01a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jvcqp" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.517928 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/403ca5f6-bd52-40de-88d6-5151b3202c76-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zhgh4\" (UID: \"403ca5f6-bd52-40de-88d6-5151b3202c76\") " pod="openshift-marketplace/marketplace-operator-79b997595-zhgh4" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.518101 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/39b41087-226b-4f73-9fc4-64616b430f2d-registry-certificates\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.518135 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/437c32d4-4b5f-4657-86d6-5214e3bfc01f-config-volume\") pod \"collect-profiles-29566725-d6wf9\" (UID: \"437c32d4-4b5f-4657-86d6-5214e3bfc01f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-d6wf9" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.518443 4860 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/34043403-110c-4547-81a4-7af1429878cd-tmpfs\") pod \"packageserver-d55dfcdfc-vfkd4\" (UID: \"34043403-110c-4547-81a4-7af1429878cd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vfkd4" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.519206 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/39b41087-226b-4f73-9fc4-64616b430f2d-trusted-ca\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.519462 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5043c51-bd3f-461f-b011-a42ad38ed7d4-config\") pod \"service-ca-operator-777779d784-x4x44\" (UID: \"f5043c51-bd3f-461f-b011-a42ad38ed7d4\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-x4x44" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.519522 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/39b41087-226b-4f73-9fc4-64616b430f2d-ca-trust-extracted\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.520908 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8d5eff6-150c-4314-8ebc-38b3660ce01a-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-jvcqp\" (UID: \"e8d5eff6-150c-4314-8ebc-38b3660ce01a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jvcqp" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 
10:58:22.520959 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f171fccd-40ef-44a6-941f-ef1f2f4d2c2a-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-v6ff7\" (UID: \"f171fccd-40ef-44a6-941f-ef1f2f4d2c2a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v6ff7" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.524793 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3c8d3f73-b9b3-48cc-b6a0-f882d45bc9d3-proxy-tls\") pod \"machine-config-operator-74547568cd-2glmz\" (UID: \"3c8d3f73-b9b3-48cc-b6a0-f882d45bc9d3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2glmz" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.524991 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/437c32d4-4b5f-4657-86d6-5214e3bfc01f-secret-volume\") pod \"collect-profiles-29566725-d6wf9\" (UID: \"437c32d4-4b5f-4657-86d6-5214e3bfc01f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-d6wf9" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.525171 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/34043403-110c-4547-81a4-7af1429878cd-apiservice-cert\") pod \"packageserver-d55dfcdfc-vfkd4\" (UID: \"34043403-110c-4547-81a4-7af1429878cd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vfkd4" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.525395 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9d98ac55-cf65-4f72-805b-dd3da2742004-profile-collector-cert\") pod \"olm-operator-6b444d44fb-9ffz6\" (UID: \"9d98ac55-cf65-4f72-805b-dd3da2742004\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9ffz6" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.525980 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5043c51-bd3f-461f-b011-a42ad38ed7d4-serving-cert\") pod \"service-ca-operator-777779d784-x4x44\" (UID: \"f5043c51-bd3f-461f-b011-a42ad38ed7d4\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-x4x44" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.526545 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/628f2025-d050-42a9-bf56-9daa0e5c001b-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-mcqdw\" (UID: \"628f2025-d050-42a9-bf56-9daa0e5c001b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mcqdw" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.527411 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-hfxcc" event={"ID":"a58a54d3-d454-4503-8b70-0e78784efdfc","Type":"ContainerStarted","Data":"d7f22518b520d9c10d2d73fa5144d12a4e95e49e74395d5b60a03b895d44f408"} Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.527466 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-hfxcc" event={"ID":"a58a54d3-d454-4503-8b70-0e78784efdfc","Type":"ContainerStarted","Data":"8b9a8cb26cb885997ef61249d758bafd09ddc862da19b4fd6d054c4c5141458c"} Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.527718 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/403ca5f6-bd52-40de-88d6-5151b3202c76-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zhgh4\" (UID: \"403ca5f6-bd52-40de-88d6-5151b3202c76\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-zhgh4" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.528120 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/8bc351c5-b724-443e-a7e2-f4abba352cef-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-jnqsw\" (UID: \"8bc351c5-b724-443e-a7e2-f4abba352cef\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jnqsw" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.528504 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9d98ac55-cf65-4f72-805b-dd3da2742004-srv-cert\") pod \"olm-operator-6b444d44fb-9ffz6\" (UID: \"9d98ac55-cf65-4f72-805b-dd3da2742004\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9ffz6" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.529002 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/34043403-110c-4547-81a4-7af1429878cd-webhook-cert\") pod \"packageserver-d55dfcdfc-vfkd4\" (UID: \"34043403-110c-4547-81a4-7af1429878cd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vfkd4" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.531078 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-5gdgj" event={"ID":"d582dc3e-7510-42be-aa3a-1d15b35c327c","Type":"ContainerStarted","Data":"955bb582cec92cd2ecd2c40b586780e0c9b8fdf6f046be17de2e9d5aa6119a42"} Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.531114 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-5gdgj" event={"ID":"d582dc3e-7510-42be-aa3a-1d15b35c327c","Type":"ContainerStarted","Data":"4007d87bec9e8cbbb2f96fba62cbd83fc23bd3a2fceb0d216c6cc21d5756432d"} 
Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.531381 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/39b41087-226b-4f73-9fc4-64616b430f2d-installation-pull-secrets\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.532398 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lcdbx" event={"ID":"37972326-b1df-484f-ab10-9c595b145d8c","Type":"ContainerStarted","Data":"150a5762d679342e63f55b6c6648f93ca6c3f2fda48ad4b9322b94704a226cc2"} Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.533167 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hz7zj" event={"ID":"2f2e3391-68a2-43a8-aba9-17e583066b03","Type":"ContainerStarted","Data":"407ff7330deb9e9b1f7b10afba7128995bbec481b413ddbaec60802a65dd70f3"} Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.533934 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pr8t8" event={"ID":"24a452b3-94a8-4c29-8409-cb1a8dd11555","Type":"ContainerStarted","Data":"96f6b004bfae066c2c561b1c3a0e043026c1989e2006e7f31f8c168092822802"} Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.534078 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/39b41087-226b-4f73-9fc4-64616b430f2d-registry-tls\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.535872 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-76g8f" event={"ID":"ba450d5b-c962-4788-8215-d1eb12f9b314","Type":"ContainerStarted","Data":"82555483b3fc53c5b884a2ecae2f37dcfafce8a387c4121ff7f51f3a61df43ac"} Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.535892 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-76g8f" event={"ID":"ba450d5b-c962-4788-8215-d1eb12f9b314","Type":"ContainerStarted","Data":"04e53020d08b82eb6c1040a79b1ef2d1751e0e3d0b8b6ccf20f05e84232b8159"} Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.540495 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-twkfs" event={"ID":"0eff7ea5-251b-44de-b129-c604349d6e6c","Type":"ContainerStarted","Data":"d5905e478a83932c595496c691eb2aa9ded98c6cc8b295a165e6b198c1a83626"} Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.548868 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-7xnrh" event={"ID":"6ef8eec8-b86d-4f5a-931e-c76e11c07f94","Type":"ContainerStarted","Data":"80c24fbcd6aa7799e3ff2f38bb7fe7014267c8d3945752963f8b906a277e84a6"} Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.548908 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-7xnrh" event={"ID":"6ef8eec8-b86d-4f5a-931e-c76e11c07f94","Type":"ContainerStarted","Data":"2980d18f8ba28b033dbf5d36b160c7efad5ae1c6664cb1dba27569a3bcc37db2"} Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.549438 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-7xnrh" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.551322 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nxq82" 
event={"ID":"f15060fa-5a28-4a12-be7b-2823e921eb90","Type":"ContainerStarted","Data":"a1580a457002eb0c992197304d7aa1c99c6001d60b87a1e21dc5b0c8a7c76848"} Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.552108 4860 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-7xnrh container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.552150 4860 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-7xnrh" podUID="6ef8eec8-b86d-4f5a-931e-c76e11c07f94" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.552957 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5pq7k" event={"ID":"cfa87f04-40c6-4575-b647-fb13a115b81d","Type":"ContainerStarted","Data":"2a65c276eca2a8f4efd1f7a4c2c80d5c458f8825408efa4c709865822d2e34d3"} Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.553001 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5pq7k" event={"ID":"cfa87f04-40c6-4575-b647-fb13a115b81d","Type":"ContainerStarted","Data":"5b41337622fad4321f7bb86fe4803fd1cf32c952050c583a599de54f99215a27"} Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.556574 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hjpq\" (UniqueName: \"kubernetes.io/projected/ba2ab33e-6ecc-4eac-9aaa-256e6ff68236-kube-api-access-5hjpq\") pod \"auto-csr-approver-29566738-5cj22\" (UID: \"ba2ab33e-6ecc-4eac-9aaa-256e6ff68236\") " 
pod="openshift-infra/auto-csr-approver-29566738-5cj22" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.562980 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-9ktqw" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.565314 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gvm7\" (UniqueName: \"kubernetes.io/projected/628f2025-d050-42a9-bf56-9daa0e5c001b-kube-api-access-6gvm7\") pod \"package-server-manager-789f6589d5-mcqdw\" (UID: \"628f2025-d050-42a9-bf56-9daa0e5c001b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mcqdw" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.608146 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/37b4c0fc-6a82-4f6b-85fc-233090358f9c-certs\") pod \"machine-config-server-x4xrf\" (UID: \"37b4c0fc-6a82-4f6b-85fc-233090358f9c\") " pod="openshift-machine-config-operator/machine-config-server-x4xrf" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.608193 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383-socket-dir\") pod \"csi-hostpathplugin-2k58g\" (UID: \"dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383\") " pod="hostpath-provisioner/csi-hostpathplugin-2k58g" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.608248 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.608284 4860 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1241cd05-23d3-4e5a-9130-29e7638003a9-cert\") pod \"ingress-canary-wt65f\" (UID: \"1241cd05-23d3-4e5a-9130-29e7638003a9\") " pod="openshift-ingress-canary/ingress-canary-wt65f" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.608322 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zh75p\" (UniqueName: \"kubernetes.io/projected/dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383-kube-api-access-zh75p\") pod \"csi-hostpathplugin-2k58g\" (UID: \"dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383\") " pod="hostpath-provisioner/csi-hostpathplugin-2k58g" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.608381 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtll7\" (UniqueName: \"kubernetes.io/projected/37b4c0fc-6a82-4f6b-85fc-233090358f9c-kube-api-access-qtll7\") pod \"machine-config-server-x4xrf\" (UID: \"37b4c0fc-6a82-4f6b-85fc-233090358f9c\") " pod="openshift-machine-config-operator/machine-config-server-x4xrf" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.608442 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/37b4c0fc-6a82-4f6b-85fc-233090358f9c-node-bootstrap-token\") pod \"machine-config-server-x4xrf\" (UID: \"37b4c0fc-6a82-4f6b-85fc-233090358f9c\") " pod="openshift-machine-config-operator/machine-config-server-x4xrf" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.608463 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmjsg\" (UniqueName: \"kubernetes.io/projected/1241cd05-23d3-4e5a-9130-29e7638003a9-kube-api-access-gmjsg\") pod \"ingress-canary-wt65f\" (UID: \"1241cd05-23d3-4e5a-9130-29e7638003a9\") " pod="openshift-ingress-canary/ingress-canary-wt65f" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 
10:58:22.608482 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwl24\" (UniqueName: \"kubernetes.io/projected/10eeabbc-0cee-4be7-a0aa-c0fe0c570e0e-kube-api-access-zwl24\") pod \"dns-default-l2xf5\" (UID: \"10eeabbc-0cee-4be7-a0aa-c0fe0c570e0e\") " pod="openshift-dns/dns-default-l2xf5" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.608592 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383-plugins-dir\") pod \"csi-hostpathplugin-2k58g\" (UID: \"dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383\") " pod="hostpath-provisioner/csi-hostpathplugin-2k58g" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.608613 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/10eeabbc-0cee-4be7-a0aa-c0fe0c570e0e-config-volume\") pod \"dns-default-l2xf5\" (UID: \"10eeabbc-0cee-4be7-a0aa-c0fe0c570e0e\") " pod="openshift-dns/dns-default-l2xf5" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.608650 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383-mountpoint-dir\") pod \"csi-hostpathplugin-2k58g\" (UID: \"dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383\") " pod="hostpath-provisioner/csi-hostpathplugin-2k58g" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.608669 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383-csi-data-dir\") pod \"csi-hostpathplugin-2k58g\" (UID: \"dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383\") " pod="hostpath-provisioner/csi-hostpathplugin-2k58g" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.608705 4860 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383-registration-dir\") pod \"csi-hostpathplugin-2k58g\" (UID: \"dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383\") " pod="hostpath-provisioner/csi-hostpathplugin-2k58g" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.608723 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/10eeabbc-0cee-4be7-a0aa-c0fe0c570e0e-metrics-tls\") pod \"dns-default-l2xf5\" (UID: \"10eeabbc-0cee-4be7-a0aa-c0fe0c570e0e\") " pod="openshift-dns/dns-default-l2xf5" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.609156 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqvlf\" (UniqueName: \"kubernetes.io/projected/403ca5f6-bd52-40de-88d6-5151b3202c76-kube-api-access-hqvlf\") pod \"marketplace-operator-79b997595-zhgh4\" (UID: \"403ca5f6-bd52-40de-88d6-5151b3202c76\") " pod="openshift-marketplace/marketplace-operator-79b997595-zhgh4" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.610388 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383-mountpoint-dir\") pod \"csi-hostpathplugin-2k58g\" (UID: \"dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383\") " pod="hostpath-provisioner/csi-hostpathplugin-2k58g" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.610549 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383-csi-data-dir\") pod \"csi-hostpathplugin-2k58g\" (UID: \"dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383\") " pod="hostpath-provisioner/csi-hostpathplugin-2k58g" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.611026 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/10eeabbc-0cee-4be7-a0aa-c0fe0c570e0e-config-volume\") pod \"dns-default-l2xf5\" (UID: \"10eeabbc-0cee-4be7-a0aa-c0fe0c570e0e\") " pod="openshift-dns/dns-default-l2xf5" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.611261 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383-registration-dir\") pod \"csi-hostpathplugin-2k58g\" (UID: \"dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383\") " pod="hostpath-provisioner/csi-hostpathplugin-2k58g" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.611624 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383-plugins-dir\") pod \"csi-hostpathplugin-2k58g\" (UID: \"dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383\") " pod="hostpath-provisioner/csi-hostpathplugin-2k58g" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.615726 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383-socket-dir\") pod \"csi-hostpathplugin-2k58g\" (UID: \"dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383\") " pod="hostpath-provisioner/csi-hostpathplugin-2k58g" Mar 20 10:58:22 crc kubenswrapper[4860]: E0320 10:58:22.616483 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:23.116459942 +0000 UTC m=+227.337820850 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.618391 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/10eeabbc-0cee-4be7-a0aa-c0fe0c570e0e-metrics-tls\") pod \"dns-default-l2xf5\" (UID: \"10eeabbc-0cee-4be7-a0aa-c0fe0c570e0e\") " pod="openshift-dns/dns-default-l2xf5" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.619246 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/37b4c0fc-6a82-4f6b-85fc-233090358f9c-node-bootstrap-token\") pod \"machine-config-server-x4xrf\" (UID: \"37b4c0fc-6a82-4f6b-85fc-233090358f9c\") " pod="openshift-machine-config-operator/machine-config-server-x4xrf" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.620704 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/37b4c0fc-6a82-4f6b-85fc-233090358f9c-certs\") pod \"machine-config-server-x4xrf\" (UID: \"37b4c0fc-6a82-4f6b-85fc-233090358f9c\") " pod="openshift-machine-config-operator/machine-config-server-x4xrf" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.626938 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1241cd05-23d3-4e5a-9130-29e7638003a9-cert\") pod \"ingress-canary-wt65f\" (UID: \"1241cd05-23d3-4e5a-9130-29e7638003a9\") " pod="openshift-ingress-canary/ingress-canary-wt65f" Mar 20 10:58:22 crc 
kubenswrapper[4860]: I0320 10:58:22.632217 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7mtp\" (UniqueName: \"kubernetes.io/projected/9d98ac55-cf65-4f72-805b-dd3da2742004-kube-api-access-p7mtp\") pod \"olm-operator-6b444d44fb-9ffz6\" (UID: \"9d98ac55-cf65-4f72-805b-dd3da2742004\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9ffz6" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.663159 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnfgq\" (UniqueName: \"kubernetes.io/projected/3c8d3f73-b9b3-48cc-b6a0-f882d45bc9d3-kube-api-access-xnfgq\") pod \"machine-config-operator-74547568cd-2glmz\" (UID: \"3c8d3f73-b9b3-48cc-b6a0-f882d45bc9d3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2glmz" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.673701 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n92tj\" (UniqueName: \"kubernetes.io/projected/39b41087-226b-4f73-9fc4-64616b430f2d-kube-api-access-n92tj\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.693309 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e8d5eff6-150c-4314-8ebc-38b3660ce01a-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-jvcqp\" (UID: \"e8d5eff6-150c-4314-8ebc-38b3660ce01a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jvcqp" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.705756 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4429p"] Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.710162 4860 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:22 crc kubenswrapper[4860]: E0320 10:58:22.710401 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:23.210362344 +0000 UTC m=+227.431723242 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.711377 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:22 crc kubenswrapper[4860]: E0320 10:58:22.712260 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:23.212215945 +0000 UTC m=+227.433576833 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.714042 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-248wx\" (UniqueName: \"kubernetes.io/projected/34043403-110c-4547-81a4-7af1429878cd-kube-api-access-248wx\") pod \"packageserver-d55dfcdfc-vfkd4\" (UID: \"34043403-110c-4547-81a4-7af1429878cd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vfkd4" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.718747 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2glmz" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.737163 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpk7d\" (UniqueName: \"kubernetes.io/projected/437c32d4-4b5f-4657-86d6-5214e3bfc01f-kube-api-access-wpk7d\") pod \"collect-profiles-29566725-d6wf9\" (UID: \"437c32d4-4b5f-4657-86d6-5214e3bfc01f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-d6wf9" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.744776 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vfkd4" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.756106 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mcqdw" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.756538 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/39b41087-226b-4f73-9fc4-64616b430f2d-bound-sa-token\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.769023 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fplbn\" (UniqueName: \"kubernetes.io/projected/f5043c51-bd3f-461f-b011-a42ad38ed7d4-kube-api-access-fplbn\") pod \"service-ca-operator-777779d784-x4x44\" (UID: \"f5043c51-bd3f-461f-b011-a42ad38ed7d4\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-x4x44" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.774108 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-d6wf9" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.793232 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.805304 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566738-5cj22" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.809760 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-6xl8q"] Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.810203 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f171fccd-40ef-44a6-941f-ef1f2f4d2c2a-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-v6ff7\" (UID: \"f171fccd-40ef-44a6-941f-ef1f2f4d2c2a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v6ff7" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.813081 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9q9t\" (UniqueName: \"kubernetes.io/projected/8bc351c5-b724-443e-a7e2-f4abba352cef-kube-api-access-p9q9t\") pod \"control-plane-machine-set-operator-78cbb6b69f-jnqsw\" (UID: \"8bc351c5-b724-443e-a7e2-f4abba352cef\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jnqsw" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.813601 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9ffz6" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.814358 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:22 crc kubenswrapper[4860]: E0320 10:58:22.814957 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:23.314937573 +0000 UTC m=+227.536298471 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.820282 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zhgh4" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.875164 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmjsg\" (UniqueName: \"kubernetes.io/projected/1241cd05-23d3-4e5a-9130-29e7638003a9-kube-api-access-gmjsg\") pod \"ingress-canary-wt65f\" (UID: \"1241cd05-23d3-4e5a-9130-29e7638003a9\") " pod="openshift-ingress-canary/ingress-canary-wt65f" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.889531 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwl24\" (UniqueName: \"kubernetes.io/projected/10eeabbc-0cee-4be7-a0aa-c0fe0c570e0e-kube-api-access-zwl24\") pod \"dns-default-l2xf5\" (UID: \"10eeabbc-0cee-4be7-a0aa-c0fe0c570e0e\") " pod="openshift-dns/dns-default-l2xf5" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.899386 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zh75p\" (UniqueName: \"kubernetes.io/projected/dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383-kube-api-access-zh75p\") pod \"csi-hostpathplugin-2k58g\" (UID: \"dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383\") " pod="hostpath-provisioner/csi-hostpathplugin-2k58g" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.916557 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:22 crc kubenswrapper[4860]: E0320 10:58:22.918660 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-20 10:58:23.418635937 +0000 UTC m=+227.639996835 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.930458 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtll7\" (UniqueName: \"kubernetes.io/projected/37b4c0fc-6a82-4f6b-85fc-233090358f9c-kube-api-access-qtll7\") pod \"machine-config-server-x4xrf\" (UID: \"37b4c0fc-6a82-4f6b-85fc-233090358f9c\") " pod="openshift-machine-config-operator/machine-config-server-x4xrf" Mar 20 10:58:22 crc kubenswrapper[4860]: I0320 10:58:22.937652 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-pc5tf"] Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.018901 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:23 crc kubenswrapper[4860]: E0320 10:58:23.019418 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:23.51939831 +0000 UTC m=+227.740759218 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.019606 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v6ff7" Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.020197 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jvcqp" Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.020977 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-5gdgj" Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.021250 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jnqsw" Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.023673 4860 patch_prober.go:28] interesting pod/router-default-5444994796-5gdgj container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.023726 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5gdgj" podUID="d582dc3e-7510-42be-aa3a-1d15b35c327c" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.068133 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-x4x44" Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.124888 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:23 crc kubenswrapper[4860]: E0320 10:58:23.125778 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:23.625764039 +0000 UTC m=+227.847124927 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.136345 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-2k58g" Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.167276 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-wt65f" Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.185420 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-l2xf5" Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.200801 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-x4xrf" Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.233383 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:23 crc kubenswrapper[4860]: E0320 10:58:23.233933 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 10:58:23.733910807 +0000 UTC m=+227.955271705 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.335210 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.335829 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-s52jd"] Mar 20 10:58:23 crc kubenswrapper[4860]: E0320 10:58:23.335962 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:23.835935155 +0000 UTC m=+228.057296053 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.436345 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:23 crc kubenswrapper[4860]: E0320 10:58:23.436835 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:23.936814311 +0000 UTC m=+228.158175209 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.545370 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:23 crc kubenswrapper[4860]: E0320 10:58:23.546855 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:24.046827861 +0000 UTC m=+228.268188759 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.574522 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-76g8f" podStartSLOduration=155.574491471 podStartE2EDuration="2m35.574491471s" podCreationTimestamp="2026-03-20 10:55:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:23.573183864 +0000 UTC m=+227.794544762" watchObservedRunningTime="2026-03-20 10:58:23.574491471 +0000 UTC m=+227.795852369" Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.585691 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hz7zj" event={"ID":"2f2e3391-68a2-43a8-aba9-17e583066b03","Type":"ContainerStarted","Data":"d15e97a481a2866411fa098f353115fd974d34dab62e8e9911124e820ac31078"} Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.598148 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nxq82" event={"ID":"f15060fa-5a28-4a12-be7b-2823e921eb90","Type":"ContainerStarted","Data":"772e9b2c3443f142ebe0053c3effb51efaae04c134aa23c92ff36573b43b35df"} Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.598491 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nxq82" Mar 20 10:58:23 crc 
kubenswrapper[4860]: I0320 10:58:23.601962 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" event={"ID":"3587f3ba-577b-425a-adf5-336a8977dcc5","Type":"ContainerStarted","Data":"366c71d2561bff010f4d5dff91d7764636b34e8d53c1f0235c50a2b7eb65710b"} Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.616824 4860 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-nxq82 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.616909 4860 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nxq82" podUID="f15060fa-5a28-4a12-be7b-2823e921eb90" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.624597 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lcdbx" event={"ID":"37972326-b1df-484f-ab10-9c595b145d8c","Type":"ContainerStarted","Data":"8e3bcc81cdd00a709affe27df968c6c77cd3b334be5615abb138b11884ded3f7"} Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.626611 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-pc5tf" event={"ID":"7ee7ff56-a3fb-465a-9ab5-8cba6bbfcd0e","Type":"ContainerStarted","Data":"e69053991352de55130197ac65f86ee443a634f8c6f03ab8a7b0e76ccb89ebff"} Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.631502 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pr8t8" 
event={"ID":"24a452b3-94a8-4c29-8409-cb1a8dd11555","Type":"ContainerStarted","Data":"7f86054b01bd27d3967146848a350f1aec6609eb38c33d089881ea7c7eef77f3"} Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.632645 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-x4xrf" event={"ID":"37b4c0fc-6a82-4f6b-85fc-233090358f9c","Type":"ContainerStarted","Data":"3bfc86a783d03b8665695deb2c7a8ebbef5d8f64d921b3278190db7abb1fcc2b"} Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.637336 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xmkqm" event={"ID":"f67156a9-f474-4d80-9789-ffbfcc9ec78b","Type":"ContainerStarted","Data":"b5749d497a851df1c8d25600c4f06807dfeb144cf1330047dd7385abcb40ccc2"} Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.637399 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xmkqm" event={"ID":"f67156a9-f474-4d80-9789-ffbfcc9ec78b","Type":"ContainerStarted","Data":"ea92a36da340039fb1ce36999b468209253b73004a017fa10428756510dbd011"} Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.646788 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-6xl8q" event={"ID":"c9dab77c-3c60-4c91-8c0a-31791124462d","Type":"ContainerStarted","Data":"728f7b7c66988558b1d883323799eddbefdffa3dccc36021b7936ebb56e4353c"} Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.647067 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:23 crc kubenswrapper[4860]: E0320 10:58:23.647976 4860 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:24.147955754 +0000 UTC m=+228.369316642 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.659403 4860 generic.go:334] "Generic (PLEG): container finished" podID="0eff7ea5-251b-44de-b129-c604349d6e6c" containerID="bf29f5e88646849ccbf1ef41a523006e5d0e1517aafc4bfd201c3385a8c598bb" exitCode=0 Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.659548 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-twkfs" event={"ID":"0eff7ea5-251b-44de-b129-c604349d6e6c","Type":"ContainerDied","Data":"bf29f5e88646849ccbf1ef41a523006e5d0e1517aafc4bfd201c3385a8c598bb"} Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.670548 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4429p" event={"ID":"d62cba0d-d390-4638-aa42-59631e4bf118","Type":"ContainerStarted","Data":"44721ad3dbc20a124e5ccbe9787803f6ec0cc51a6aecb26e2a19230ed22c5b87"} Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.670610 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4429p" 
event={"ID":"d62cba0d-d390-4638-aa42-59631e4bf118","Type":"ContainerStarted","Data":"c4c3728cb0ff75517d60e0661b517c4cc605807cf199fa6e242c96682e23cf56"} Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.671371 4860 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-7xnrh container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.671467 4860 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-7xnrh" podUID="6ef8eec8-b86d-4f5a-931e-c76e11c07f94" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.737289 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-hfxcc" podStartSLOduration=155.737256468 podStartE2EDuration="2m35.737256468s" podCreationTimestamp="2026-03-20 10:55:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:23.724917975 +0000 UTC m=+227.946278863" watchObservedRunningTime="2026-03-20 10:58:23.737256468 +0000 UTC m=+227.958617366" Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.749994 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:23 crc kubenswrapper[4860]: E0320 
10:58:23.753300 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:24.253284564 +0000 UTC m=+228.474645472 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.860989 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:23 crc kubenswrapper[4860]: E0320 10:58:23.861483 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:24.361449463 +0000 UTC m=+228.582810361 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.868291 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:23 crc kubenswrapper[4860]: E0320 10:58:23.868753 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:24.368734836 +0000 UTC m=+228.590095734 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.978152 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:23 crc kubenswrapper[4860]: E0320 10:58:23.978510 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:24.478453067 +0000 UTC m=+228.699813965 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:23 crc kubenswrapper[4860]: I0320 10:58:23.978617 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:23 crc kubenswrapper[4860]: E0320 10:58:23.979425 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:24.479414014 +0000 UTC m=+228.700774912 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.086060 4860 patch_prober.go:28] interesting pod/router-default-5444994796-5gdgj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 10:58:24 crc kubenswrapper[4860]: [-]has-synced failed: reason withheld Mar 20 10:58:24 crc kubenswrapper[4860]: [+]process-running ok Mar 20 10:58:24 crc kubenswrapper[4860]: healthz check failed Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.086817 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5gdgj" podUID="d582dc3e-7510-42be-aa3a-1d15b35c327c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.088063 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:24 crc kubenswrapper[4860]: E0320 10:58:24.088566 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 10:58:24.588530809 +0000 UTC m=+228.809891707 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.158727 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-5gdgj" podStartSLOduration=156.158696111 podStartE2EDuration="2m36.158696111s" podCreationTimestamp="2026-03-20 10:55:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:24.158567948 +0000 UTC m=+228.379928866" watchObservedRunningTime="2026-03-20 10:58:24.158696111 +0000 UTC m=+228.380057009" Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.189995 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:24 crc kubenswrapper[4860]: E0320 10:58:24.190449 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:24.690429654 +0000 UTC m=+228.911790552 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.297544 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:24 crc kubenswrapper[4860]: E0320 10:58:24.297956 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:24.797937504 +0000 UTC m=+229.019298402 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.399487 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:24 crc kubenswrapper[4860]: E0320 10:58:24.399961 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:24.899940392 +0000 UTC m=+229.121301290 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.428442 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-sqrz5"] Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.464292 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-7xnrh" podStartSLOduration=156.464271641 podStartE2EDuration="2m36.464271641s" podCreationTimestamp="2026-03-20 10:55:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:24.463008396 +0000 UTC m=+228.684369294" watchObservedRunningTime="2026-03-20 10:58:24.464271641 +0000 UTC m=+228.685632539" Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.501083 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:24 crc kubenswrapper[4860]: E0320 10:58:24.501736 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 10:58:25.001694122 +0000 UTC m=+229.223055030 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.510853 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wv5fd"] Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.527406 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-jffj8"] Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.529853 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5pq7k" podStartSLOduration=156.529817374 podStartE2EDuration="2m36.529817374s" podCreationTimestamp="2026-03-20 10:55:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:24.501504187 +0000 UTC m=+228.722865085" watchObservedRunningTime="2026-03-20 10:58:24.529817374 +0000 UTC m=+228.751178282" Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.537804 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-vpp2k"] Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.541367 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-vk2rn"] Mar 20 10:58:24 crc kubenswrapper[4860]: 
I0320 10:58:24.584439 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qgkd4"] Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.603491 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:24 crc kubenswrapper[4860]: E0320 10:58:24.603963 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:25.103947286 +0000 UTC m=+229.325308184 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.671403 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nxq82" podStartSLOduration=155.671380642 podStartE2EDuration="2m35.671380642s" podCreationTimestamp="2026-03-20 10:55:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:24.628349485 +0000 UTC m=+228.849710403" watchObservedRunningTime="2026-03-20 
10:58:24.671380642 +0000 UTC m=+228.892741540" Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.673634 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7xnrh"] Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.715157 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:24 crc kubenswrapper[4860]: E0320 10:58:24.715843 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:25.215825709 +0000 UTC m=+229.437186607 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.716926 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4429p" podStartSLOduration=156.716902478 podStartE2EDuration="2m36.716902478s" podCreationTimestamp="2026-03-20 10:55:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:24.716169468 +0000 UTC m=+228.937530366" watchObservedRunningTime="2026-03-20 10:58:24.716902478 +0000 UTC m=+228.938263366" Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.744742 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-pc5tf" event={"ID":"7ee7ff56-a3fb-465a-9ab5-8cba6bbfcd0e","Type":"ContainerStarted","Data":"bf6e780cda4d0e96b97c12d1f930b8d9a9052663180824d5d07b084774f4f428"} Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.746010 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-pc5tf" Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.747328 4860 patch_prober.go:28] interesting pod/console-operator-58897d9998-pc5tf container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.19:8443/readyz\": dial tcp 10.217.0.19:8443: connect: connection refused" start-of-body= Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 
10:58:24.747384 4860 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-pc5tf" podUID="7ee7ff56-a3fb-465a-9ab5-8cba6bbfcd0e" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.19:8443/readyz\": dial tcp 10.217.0.19:8443: connect: connection refused" Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.751755 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6stjq"] Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.789134 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-nxq82"] Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.791137 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hz7zj" event={"ID":"2f2e3391-68a2-43a8-aba9-17e583066b03","Type":"ContainerStarted","Data":"d6080f61cdb25b20af439b102bbb43cd441921ae90ae93f1c70d77b6ed385819"} Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.814074 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pr8t8" event={"ID":"24a452b3-94a8-4c29-8409-cb1a8dd11555","Type":"ContainerStarted","Data":"993427f66306f40b17690f6f8ef81c6fa79154ebe2368ca11c2a070064244bc5"} Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.817541 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:24 crc kubenswrapper[4860]: E0320 10:58:24.819388 4860 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:25.319368619 +0000 UTC m=+229.540729517 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.850626 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-x4xrf" event={"ID":"37b4c0fc-6a82-4f6b-85fc-233090358f9c","Type":"ContainerStarted","Data":"8351d681bf6b9870f3f400882880d24a4cf069dce42ced86209a6c13b4ce520f"} Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.872536 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" event={"ID":"3587f3ba-577b-425a-adf5-336a8977dcc5","Type":"ContainerStarted","Data":"7c8b8128f311582fc677a42230767316cb90b5bf061457d3564d8b09eec2a186"} Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.873748 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.899428 4860 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-srz5x container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.8:6443/healthz\": dial tcp 10.217.0.8:6443: connect: connection refused" start-of-body= Mar 20 10:58:24 crc 
kubenswrapper[4860]: I0320 10:58:24.899522 4860 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" podUID="3587f3ba-577b-425a-adf5-336a8977dcc5" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.8:6443/healthz\": dial tcp 10.217.0.8:6443: connect: connection refused" Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.908603 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lcdbx" event={"ID":"37972326-b1df-484f-ab10-9c595b145d8c","Type":"ContainerStarted","Data":"420a5dc4ad075a9c34e1a84426f4130ed1e36ffc575249e97d527a7c520af8f2"} Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.911749 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-rtgqn"] Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.918879 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:24 crc kubenswrapper[4860]: E0320 10:58:24.920847 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:25.420825171 +0000 UTC m=+229.642186069 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.943353 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vfkd4"] Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.946876 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-2glmz"] Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.948028 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v6ff7"] Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.948965 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-sqrz5" event={"ID":"e8ca532e-b0d7-494c-886f-bff0c8009707","Type":"ContainerStarted","Data":"72e1e1c0612e639b5d9b1dd93371fee28768245c503b21f6343128336d8f4145"} Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.966942 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-pc5tf" podStartSLOduration=156.966919973 podStartE2EDuration="2m36.966919973s" podCreationTimestamp="2026-03-20 10:55:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:24.946899826 +0000 UTC m=+229.168260724" watchObservedRunningTime="2026-03-20 10:58:24.966919973 +0000 UTC m=+229.188280871" Mar 20 10:58:24 crc 
kubenswrapper[4860]: I0320 10:58:24.967191 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-l2xf5"] Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.985192 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-s52jd" event={"ID":"d4ce1856-395a-4003-9642-61da7cbdd789","Type":"ContainerStarted","Data":"645536bd2ef9c23dcee61a179ce48f4b51cec83f50bb54a946b11237357fa0e2"} Mar 20 10:58:24 crc kubenswrapper[4860]: I0320 10:58:24.985367 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-s52jd" event={"ID":"d4ce1856-395a-4003-9642-61da7cbdd789","Type":"ContainerStarted","Data":"c2e08f4783ce6285a82d3ffbfb2feb47cf7a513b1f9869344d1f6adeade4ec65"} Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.001317 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566738-5cj22"] Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.011382 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gf5nr"] Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.013212 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zhgh4"] Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.014079 4860 generic.go:334] "Generic (PLEG): container finished" podID="f67156a9-f474-4d80-9789-ffbfcc9ec78b" containerID="b5749d497a851df1c8d25600c4f06807dfeb144cf1330047dd7385abcb40ccc2" exitCode=0 Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.016578 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xmkqm" event={"ID":"f67156a9-f474-4d80-9789-ffbfcc9ec78b","Type":"ContainerDied","Data":"b5749d497a851df1c8d25600c4f06807dfeb144cf1330047dd7385abcb40ccc2"} Mar 20 10:58:25 crc 
kubenswrapper[4860]: I0320 10:58:25.043810 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xmkqm" event={"ID":"f67156a9-f474-4d80-9789-ffbfcc9ec78b","Type":"ContainerStarted","Data":"a7ed0d3f65c76368cac45c0ff7b006b217e10b81eb81c7216943753d9981573c"} Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.043853 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xmkqm" Mar 20 10:58:25 crc kubenswrapper[4860]: E0320 10:58:25.025390 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:25.525359329 +0000 UTC m=+229.746720227 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.024932 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.036076 4860 patch_prober.go:28] interesting pod/router-default-5444994796-5gdgj container/router namespace/openshift-ingress: Startup probe 
status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 10:58:25 crc kubenswrapper[4860]: [-]has-synced failed: reason withheld Mar 20 10:58:25 crc kubenswrapper[4860]: [+]process-running ok Mar 20 10:58:25 crc kubenswrapper[4860]: healthz check failed Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.044120 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5gdgj" podUID="d582dc3e-7510-42be-aa3a-1d15b35c327c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.058086 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-9ktqw"] Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.061097 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mcqdw"] Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.063062 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-6xl8q" event={"ID":"c9dab77c-3c60-4c91-8c0a-31791124462d","Type":"ContainerStarted","Data":"6212a5d7099084ae132439cdd3e54e16ca1b64ce00eb7e8d3db188b218442702"} Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.063843 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-7xnrh" podUID="6ef8eec8-b86d-4f5a-931e-c76e11c07f94" containerName="controller-manager" containerID="cri-o://80c24fbcd6aa7799e3ff2f38bb7fe7014267c8d3945752963f8b906a277e84a6" gracePeriod=30 Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.068117 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nxq82" Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 
10:58:25.069062 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-2k58g"] Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.094202 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hz7zj" podStartSLOduration=157.094154472 podStartE2EDuration="2m37.094154472s" podCreationTimestamp="2026-03-20 10:55:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:25.049876301 +0000 UTC m=+229.271237199" watchObservedRunningTime="2026-03-20 10:58:25.094154472 +0000 UTC m=+229.315515370" Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.113090 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lcdbx" podStartSLOduration=156.113061378 podStartE2EDuration="2m36.113061378s" podCreationTimestamp="2026-03-20 10:55:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:25.098647147 +0000 UTC m=+229.320008065" watchObservedRunningTime="2026-03-20 10:58:25.113061378 +0000 UTC m=+229.334422276" Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.115937 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-45vfv"] Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.132453 4860 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-7xnrh container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": read tcp 10.217.0.2:42172->10.217.0.7:8443: read: connection reset by peer" start-of-body= Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.132560 4860 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-controller-manager/controller-manager-879f6c89f-7xnrh" podUID="6ef8eec8-b86d-4f5a-931e-c76e11c07f94" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": read tcp 10.217.0.2:42172->10.217.0.7:8443: read: connection reset by peer" Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.140664 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pr8t8" podStartSLOduration=156.140579864 podStartE2EDuration="2m36.140579864s" podCreationTimestamp="2026-03-20 10:55:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:25.129479955 +0000 UTC m=+229.350840873" watchObservedRunningTime="2026-03-20 10:58:25.140579864 +0000 UTC m=+229.361940792" Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.152477 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:25 crc kubenswrapper[4860]: E0320 10:58:25.152834 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:25.652644609 +0000 UTC m=+229.874005507 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.153056 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:25 crc kubenswrapper[4860]: E0320 10:58:25.159481 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:25.659453629 +0000 UTC m=+229.880814537 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.207722 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jnqsw"] Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.227548 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9ffz6"] Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.246931 4860 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.252769 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jvcqp"] Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.262390 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:25 crc kubenswrapper[4860]: E0320 10:58:25.263539 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 10:58:25.763510713 +0000 UTC m=+229.984871621 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.287917 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566725-d6wf9"] Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.288748 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-x4x44"] Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.289736 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" podStartSLOduration=157.289720092 podStartE2EDuration="2m37.289720092s" podCreationTimestamp="2026-03-20 10:55:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:25.207593658 +0000 UTC m=+229.428954576" watchObservedRunningTime="2026-03-20 10:58:25.289720092 +0000 UTC m=+229.511080990" Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.293986 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-x4xrf" podStartSLOduration=6.293978841 podStartE2EDuration="6.293978841s" podCreationTimestamp="2026-03-20 10:58:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 
10:58:25.247130157 +0000 UTC m=+229.468491055" watchObservedRunningTime="2026-03-20 10:58:25.293978841 +0000 UTC m=+229.515339739" Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.295281 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xmkqm" podStartSLOduration=157.295276277 podStartE2EDuration="2m37.295276277s" podCreationTimestamp="2026-03-20 10:55:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:25.287632554 +0000 UTC m=+229.508993452" watchObservedRunningTime="2026-03-20 10:58:25.295276277 +0000 UTC m=+229.516637175" Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.296010 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-wt65f"] Mar 20 10:58:25 crc kubenswrapper[4860]: W0320 10:58:25.330568 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod403ca5f6_bd52_40de_88d6_5151b3202c76.slice/crio-2383508dcb35c927549b146a858bc10ffc0e92b071146282263459310c1e4a93 WatchSource:0}: Error finding container 2383508dcb35c927549b146a858bc10ffc0e92b071146282263459310c1e4a93: Status 404 returned error can't find the container with id 2383508dcb35c927549b146a858bc10ffc0e92b071146282263459310c1e4a93 Mar 20 10:58:25 crc kubenswrapper[4860]: W0320 10:58:25.337714 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod825c6b77_c03a_463c_b9a4_d26a1ac398f0.slice/crio-c602627305bcc67508e306d123ef0f2def758ac6be365554271d4574f4292d93 WatchSource:0}: Error finding container c602627305bcc67508e306d123ef0f2def758ac6be365554271d4574f4292d93: Status 404 returned error can't find the container with id c602627305bcc67508e306d123ef0f2def758ac6be365554271d4574f4292d93 Mar 20 10:58:25 
crc kubenswrapper[4860]: W0320 10:58:25.338754 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod628f2025_d050_42a9_bf56_9daa0e5c001b.slice/crio-b0677e3f5760dedf3c22df48e7121a44fa069a668de16580ffd7f952129c4950 WatchSource:0}: Error finding container b0677e3f5760dedf3c22df48e7121a44fa069a668de16580ffd7f952129c4950: Status 404 returned error can't find the container with id b0677e3f5760dedf3c22df48e7121a44fa069a668de16580ffd7f952129c4950 Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.366173 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:25 crc kubenswrapper[4860]: E0320 10:58:25.366698 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:25.866678483 +0000 UTC m=+230.088039391 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:25 crc kubenswrapper[4860]: W0320 10:58:25.417676 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d98ac55_cf65_4f72_805b_dd3da2742004.slice/crio-311fba8b08b109183afdc96b1c3152adff88a2bff256e16014ab0e9b6c0499f4 WatchSource:0}: Error finding container 311fba8b08b109183afdc96b1c3152adff88a2bff256e16014ab0e9b6c0499f4: Status 404 returned error can't find the container with id 311fba8b08b109183afdc96b1c3152adff88a2bff256e16014ab0e9b6c0499f4 Mar 20 10:58:25 crc kubenswrapper[4860]: W0320 10:58:25.418599 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8d5eff6_150c_4314_8ebc_38b3660ce01a.slice/crio-3b2687d51d2b5da138dd6c49db1115dcb5b1d49ced219c3bfc2da338f96e7921 WatchSource:0}: Error finding container 3b2687d51d2b5da138dd6c49db1115dcb5b1d49ced219c3bfc2da338f96e7921: Status 404 returned error can't find the container with id 3b2687d51d2b5da138dd6c49db1115dcb5b1d49ced219c3bfc2da338f96e7921 Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.467109 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:25 crc kubenswrapper[4860]: E0320 10:58:25.467478 4860 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:25.967461176 +0000 UTC m=+230.188822074 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.570774 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:25 crc kubenswrapper[4860]: E0320 10:58:25.571437 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:26.071423658 +0000 UTC m=+230.292784556 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.672140 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:25 crc kubenswrapper[4860]: E0320 10:58:25.672609 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:26.172591502 +0000 UTC m=+230.393952400 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.773743 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:25 crc kubenswrapper[4860]: E0320 10:58:25.774193 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:26.274176718 +0000 UTC m=+230.495537616 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.877383 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:25 crc kubenswrapper[4860]: E0320 10:58:25.877570 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:26.377536653 +0000 UTC m=+230.598897561 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.882777 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:25 crc kubenswrapper[4860]: E0320 10:58:25.883508 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:26.383458488 +0000 UTC m=+230.604819566 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.984277 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:25 crc kubenswrapper[4860]: E0320 10:58:25.984454 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:26.484411986 +0000 UTC m=+230.705772884 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:25 crc kubenswrapper[4860]: I0320 10:58:25.985003 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:25 crc kubenswrapper[4860]: E0320 10:58:25.985378 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:26.485363282 +0000 UTC m=+230.706724180 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.027042 4860 patch_prober.go:28] interesting pod/router-default-5444994796-5gdgj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 10:58:26 crc kubenswrapper[4860]: [-]has-synced failed: reason withheld Mar 20 10:58:26 crc kubenswrapper[4860]: [+]process-running ok Mar 20 10:58:26 crc kubenswrapper[4860]: healthz check failed Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.027113 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5gdgj" podUID="d582dc3e-7510-42be-aa3a-1d15b35c327c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.086391 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:26 crc kubenswrapper[4860]: E0320 10:58:26.087021 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 10:58:26.586995479 +0000 UTC m=+230.808356387 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.102893 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-d6wf9" event={"ID":"437c32d4-4b5f-4657-86d6-5214e3bfc01f","Type":"ContainerStarted","Data":"07d327bc1bb178b575c3169b4eaad76591b2a789fd3236207f1f2278827c3306"} Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.106008 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-wt65f" event={"ID":"1241cd05-23d3-4e5a-9130-29e7638003a9","Type":"ContainerStarted","Data":"469c0252f0fca726d532459416225c858cce28407a228aecb8dbe49aaa2ec784"} Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.110019 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jnqsw" event={"ID":"8bc351c5-b724-443e-a7e2-f4abba352cef","Type":"ContainerStarted","Data":"5ad7e44b418ec559bc4d49c59ea906e7ba235765030137c43979015514a1ad12"} Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.114873 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-jffj8" event={"ID":"8b0b480d-ae68-4b26-b9f8-6b3caef70971","Type":"ContainerStarted","Data":"414fbe8b6cfca314e3587fdb3aad4f95463437a4d6cb42d157c9e579cbdf1913"} Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.114950 4860 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-jffj8" event={"ID":"8b0b480d-ae68-4b26-b9f8-6b3caef70971","Type":"ContainerStarted","Data":"b20f594348279afb8642d5a7ff43240cbd15f7eb43febdb9945c4732413e3e3f"} Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.135285 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-s52jd" event={"ID":"d4ce1856-395a-4003-9642-61da7cbdd789","Type":"ContainerStarted","Data":"6428b0277a7a5be9ac441d1fc30231736c58ad72e7183e9a00e9f736ef50f66b"} Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.144273 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-jffj8" podStartSLOduration=158.144242491 podStartE2EDuration="2m38.144242491s" podCreationTimestamp="2026-03-20 10:55:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:26.142805571 +0000 UTC m=+230.364166489" watchObservedRunningTime="2026-03-20 10:58:26.144242491 +0000 UTC m=+230.365603389" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.162706 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v6ff7" event={"ID":"f171fccd-40ef-44a6-941f-ef1f2f4d2c2a","Type":"ContainerStarted","Data":"cf9fafd2774029771c6ee72f7fe27ae302a8ed4ae12276c15a8e533c4a4c9ef5"} Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.169932 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-sqrz5" event={"ID":"e8ca532e-b0d7-494c-886f-bff0c8009707","Type":"ContainerStarted","Data":"589d18b62b1f0e4971b0fb3e2fc61341927ed65c1c47ad52b5aa882f952f550d"} Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.183208 4860 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-s52jd" podStartSLOduration=157.183181594 podStartE2EDuration="2m37.183181594s" podCreationTimestamp="2026-03-20 10:55:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:26.177857746 +0000 UTC m=+230.399218644" watchObservedRunningTime="2026-03-20 10:58:26.183181594 +0000 UTC m=+230.404542492" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.189316 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.190208 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jvcqp" event={"ID":"e8d5eff6-150c-4314-8ebc-38b3660ce01a","Type":"ContainerStarted","Data":"3b2687d51d2b5da138dd6c49db1115dcb5b1d49ced219c3bfc2da338f96e7921"} Mar 20 10:58:26 crc kubenswrapper[4860]: E0320 10:58:26.191279 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:26.691256259 +0000 UTC m=+230.912617347 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.201798 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-vk2rn" event={"ID":"b7c6fefc-e60e-423d-ad15-2e16173ae01b","Type":"ContainerStarted","Data":"403212a771580bd8fc898ce5ac315f5a5cd01c0d5a1df90ecb9c00b992c1d0b0"} Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.201856 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-vk2rn" event={"ID":"b7c6fefc-e60e-423d-ad15-2e16173ae01b","Type":"ContainerStarted","Data":"32a5532bfab4a262174bf80ed87400cfb109e79cecd3a1c5897723d7a5608d4b"} Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.208321 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-9ktqw" event={"ID":"b2a9fedf-d226-4388-8432-b22efd3b74bb","Type":"ContainerStarted","Data":"a26cc8b3d618a3b1bf4af57d50283a08c621d22a73263150bbca7713001da4fd"} Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.239202 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vfkd4" event={"ID":"34043403-110c-4547-81a4-7af1429878cd","Type":"ContainerStarted","Data":"d84af7f21702b51abdfa0256281c3d2a386cc5e95a78e3ab30c2ed9df712c356"} Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.243336 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zhgh4" 
event={"ID":"403ca5f6-bd52-40de-88d6-5151b3202c76","Type":"ContainerStarted","Data":"2383508dcb35c927549b146a858bc10ffc0e92b071146282263459310c1e4a93"} Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.259510 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566738-5cj22" event={"ID":"ba2ab33e-6ecc-4eac-9aaa-256e6ff68236","Type":"ContainerStarted","Data":"ea1a7118d7d4729065b9248b97584b35507102283df7254e36c0c2abc1c111d1"} Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.263070 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rtgqn" event={"ID":"d09bb09c-7ad0-4971-b6a2-1b37bff617b5","Type":"ContainerStarted","Data":"e80b807de4a3049e5057d7d45493e0587164dea0c8c1edd9689b188800436526"} Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.263128 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rtgqn" event={"ID":"d09bb09c-7ad0-4971-b6a2-1b37bff617b5","Type":"ContainerStarted","Data":"1be8f50e5be893519c62ef1a0fe4717331250880581423c364caac3b6d6e6db5"} Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.273939 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gf5nr" event={"ID":"8fe93f79-239c-4b6a-bd22-bbdf55aff0af","Type":"ContainerStarted","Data":"10182e252a57c4dee35dd79f35e563d449506e424f564e1d6f77f6f936a8157f"} Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.281980 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gf5nr" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.286092 4860 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-gf5nr container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get 
\"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused" start-of-body= Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.286947 4860 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gf5nr" podUID="8fe93f79-239c-4b6a-bd22-bbdf55aff0af" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.293564 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:26 crc kubenswrapper[4860]: E0320 10:58:26.296657 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:26.796617849 +0000 UTC m=+231.017978747 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.317036 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qgkd4" event={"ID":"1414be44-7a88-4f16-9653-51a5793bd729","Type":"ContainerStarted","Data":"53052c3063c156aedf31adf1af52de60aeafbc6346da82a71c98e0b304e2fb37"} Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.322379 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-sqrz5" podStartSLOduration=158.322354475 podStartE2EDuration="2m38.322354475s" podCreationTimestamp="2026-03-20 10:55:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:26.219079253 +0000 UTC m=+230.440440171" watchObservedRunningTime="2026-03-20 10:58:26.322354475 +0000 UTC m=+230.543715373" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.323328 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gf5nr" podStartSLOduration=157.323320442 podStartE2EDuration="2m37.323320442s" podCreationTimestamp="2026-03-20 10:55:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:26.312534162 +0000 UTC m=+230.533895090" watchObservedRunningTime="2026-03-20 10:58:26.323320442 +0000 UTC m=+230.544681340" Mar 20 10:58:26 
crc kubenswrapper[4860]: I0320 10:58:26.334815 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9ffz6" event={"ID":"9d98ac55-cf65-4f72-805b-dd3da2742004","Type":"ContainerStarted","Data":"311fba8b08b109183afdc96b1c3152adff88a2bff256e16014ab0e9b6c0499f4"} Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.385566 4860 generic.go:334] "Generic (PLEG): container finished" podID="a8f2eaf6-3749-4695-8df1-5972598c8ac6" containerID="abb213a6d9940d2db7762d80a9868c4056074390a627a64a947a21d157bba659" exitCode=0 Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.385819 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-vpp2k" event={"ID":"a8f2eaf6-3749-4695-8df1-5972598c8ac6","Type":"ContainerDied","Data":"abb213a6d9940d2db7762d80a9868c4056074390a627a64a947a21d157bba659"} Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.385905 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-vpp2k" event={"ID":"a8f2eaf6-3749-4695-8df1-5972598c8ac6","Type":"ContainerStarted","Data":"50bed181877a9344dfc66eb7fa4304ed83502e60d2e2274f0ed4146f68a4d2cf"} Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.402477 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:26 crc kubenswrapper[4860]: E0320 10:58:26.403462 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-20 10:58:26.903213974 +0000 UTC m=+231.124574872 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.447387 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wv5fd" event={"ID":"f1a219c6-74dc-4511-867e-cf2fce301cad","Type":"ContainerStarted","Data":"6cb01048ec791d0ae2ded19a0df5345b6d32b405d6fac7dec1f065ca6e38ec54"} Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.447775 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wv5fd" event={"ID":"f1a219c6-74dc-4511-867e-cf2fce301cad","Type":"ContainerStarted","Data":"5d6c4e62cf466546c0a82273a8982d2bba36570338fc813f660a5b88b97fa102"} Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.474198 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mcqdw" event={"ID":"628f2025-d050-42a9-bf56-9daa0e5c001b","Type":"ContainerStarted","Data":"b0677e3f5760dedf3c22df48e7121a44fa069a668de16580ffd7f952129c4950"} Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.483976 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wv5fd" podStartSLOduration=157.48394643 podStartE2EDuration="2m37.48394643s" podCreationTimestamp="2026-03-20 10:55:49 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:26.473728446 +0000 UTC m=+230.695089364" watchObservedRunningTime="2026-03-20 10:58:26.48394643 +0000 UTC m=+230.705307328" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.492296 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-7xnrh" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.493800 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-6xl8q" event={"ID":"c9dab77c-3c60-4c91-8c0a-31791124462d","Type":"ContainerStarted","Data":"17a7c8b3edce8ec8ad08fac4b00ce0baebdf9ba0dbcbb8fc7ee67ec4d729c98c"} Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.496382 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-l2xf5" event={"ID":"10eeabbc-0cee-4be7-a0aa-c0fe0c570e0e","Type":"ContainerStarted","Data":"df633c8b9afba10b54e1f937581b5c2eff7f9fa11d295ff0623574ab82c94099"} Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.501414 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2glmz" event={"ID":"3c8d3f73-b9b3-48cc-b6a0-f882d45bc9d3","Type":"ContainerStarted","Data":"226188a14343e1cc22dbbf45965258ca3dd238d77f3f2111ca764af7dbf3d8d7"} Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.504113 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:26 crc kubenswrapper[4860]: E0320 10:58:26.505209 4860 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:27.005189111 +0000 UTC m=+231.226549999 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.548658 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-twkfs" event={"ID":"0eff7ea5-251b-44de-b129-c604349d6e6c","Type":"ContainerStarted","Data":"e28d29a9ed40d0a5cdf5cae2c59f8688f31f4b2d19141e3c1080df3b1880da1f"} Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.552552 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-9ffd4b47b-9qh65"] Mar 20 10:58:26 crc kubenswrapper[4860]: E0320 10:58:26.552893 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ef8eec8-b86d-4f5a-931e-c76e11c07f94" containerName="controller-manager" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.552924 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ef8eec8-b86d-4f5a-931e-c76e11c07f94" containerName="controller-manager" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.553055 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ef8eec8-b86d-4f5a-931e-c76e11c07f94" containerName="controller-manager" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.553659 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-9ffd4b47b-9qh65" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.589141 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-6xl8q" podStartSLOduration=158.589115856 podStartE2EDuration="2m38.589115856s" podCreationTimestamp="2026-03-20 10:55:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:26.589078065 +0000 UTC m=+230.810438963" watchObservedRunningTime="2026-03-20 10:58:26.589115856 +0000 UTC m=+230.810476754" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.607257 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6ef8eec8-b86d-4f5a-931e-c76e11c07f94-proxy-ca-bundles\") pod \"6ef8eec8-b86d-4f5a-931e-c76e11c07f94\" (UID: \"6ef8eec8-b86d-4f5a-931e-c76e11c07f94\") " Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.607346 4860 generic.go:334] "Generic (PLEG): container finished" podID="6ef8eec8-b86d-4f5a-931e-c76e11c07f94" containerID="80c24fbcd6aa7799e3ff2f38bb7fe7014267c8d3945752963f8b906a277e84a6" exitCode=0 Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.607536 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-7xnrh" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.607531 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-7xnrh" event={"ID":"6ef8eec8-b86d-4f5a-931e-c76e11c07f94","Type":"ContainerDied","Data":"80c24fbcd6aa7799e3ff2f38bb7fe7014267c8d3945752963f8b906a277e84a6"} Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.607599 4860 scope.go:117] "RemoveContainer" containerID="80c24fbcd6aa7799e3ff2f38bb7fe7014267c8d3945752963f8b906a277e84a6" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.608423 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ef8eec8-b86d-4f5a-931e-c76e11c07f94-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "6ef8eec8-b86d-4f5a-931e-c76e11c07f94" (UID: "6ef8eec8-b86d-4f5a-931e-c76e11c07f94"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.610551 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7g2j\" (UniqueName: \"kubernetes.io/projected/6ef8eec8-b86d-4f5a-931e-c76e11c07f94-kube-api-access-x7g2j\") pod \"6ef8eec8-b86d-4f5a-931e-c76e11c07f94\" (UID: \"6ef8eec8-b86d-4f5a-931e-c76e11c07f94\") " Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.610740 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ef8eec8-b86d-4f5a-931e-c76e11c07f94-config\") pod \"6ef8eec8-b86d-4f5a-931e-c76e11c07f94\" (UID: \"6ef8eec8-b86d-4f5a-931e-c76e11c07f94\") " Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.610777 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6ef8eec8-b86d-4f5a-931e-c76e11c07f94-client-ca\") pod \"6ef8eec8-b86d-4f5a-931e-c76e11c07f94\" (UID: \"6ef8eec8-b86d-4f5a-931e-c76e11c07f94\") " Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.610820 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ef8eec8-b86d-4f5a-931e-c76e11c07f94-serving-cert\") pod \"6ef8eec8-b86d-4f5a-931e-c76e11c07f94\" (UID: \"6ef8eec8-b86d-4f5a-931e-c76e11c07f94\") " Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.610948 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-9ffd4b47b-9qh65"] Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.611114 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2af6ac3e-b4b2-400b-bfd2-0142f07dabd0-client-ca\") pod \"controller-manager-9ffd4b47b-9qh65\" (UID: \"2af6ac3e-b4b2-400b-bfd2-0142f07dabd0\") 
" pod="openshift-controller-manager/controller-manager-9ffd4b47b-9qh65" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.611244 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdqss\" (UniqueName: \"kubernetes.io/projected/2af6ac3e-b4b2-400b-bfd2-0142f07dabd0-kube-api-access-bdqss\") pod \"controller-manager-9ffd4b47b-9qh65\" (UID: \"2af6ac3e-b4b2-400b-bfd2-0142f07dabd0\") " pod="openshift-controller-manager/controller-manager-9ffd4b47b-9qh65" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.611290 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.611326 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2af6ac3e-b4b2-400b-bfd2-0142f07dabd0-proxy-ca-bundles\") pod \"controller-manager-9ffd4b47b-9qh65\" (UID: \"2af6ac3e-b4b2-400b-bfd2-0142f07dabd0\") " pod="openshift-controller-manager/controller-manager-9ffd4b47b-9qh65" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.611391 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2af6ac3e-b4b2-400b-bfd2-0142f07dabd0-serving-cert\") pod \"controller-manager-9ffd4b47b-9qh65\" (UID: \"2af6ac3e-b4b2-400b-bfd2-0142f07dabd0\") " pod="openshift-controller-manager/controller-manager-9ffd4b47b-9qh65" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.611515 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2af6ac3e-b4b2-400b-bfd2-0142f07dabd0-config\") pod \"controller-manager-9ffd4b47b-9qh65\" (UID: \"2af6ac3e-b4b2-400b-bfd2-0142f07dabd0\") " pod="openshift-controller-manager/controller-manager-9ffd4b47b-9qh65" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.611509 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ef8eec8-b86d-4f5a-931e-c76e11c07f94-config" (OuterVolumeSpecName: "config") pod "6ef8eec8-b86d-4f5a-931e-c76e11c07f94" (UID: "6ef8eec8-b86d-4f5a-931e-c76e11c07f94"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.611999 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ef8eec8-b86d-4f5a-931e-c76e11c07f94-client-ca" (OuterVolumeSpecName: "client-ca") pod "6ef8eec8-b86d-4f5a-931e-c76e11c07f94" (UID: "6ef8eec8-b86d-4f5a-931e-c76e11c07f94"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.615557 4860 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6ef8eec8-b86d-4f5a-931e-c76e11c07f94-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:26 crc kubenswrapper[4860]: E0320 10:58:26.616371 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:27.116347583 +0000 UTC m=+231.337708481 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.624513 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ef8eec8-b86d-4f5a-931e-c76e11c07f94-kube-api-access-x7g2j" (OuterVolumeSpecName: "kube-api-access-x7g2j") pod "6ef8eec8-b86d-4f5a-931e-c76e11c07f94" (UID: "6ef8eec8-b86d-4f5a-931e-c76e11c07f94"). InnerVolumeSpecName "kube-api-access-x7g2j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.636935 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ef8eec8-b86d-4f5a-931e-c76e11c07f94-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6ef8eec8-b86d-4f5a-931e-c76e11c07f94" (UID: "6ef8eec8-b86d-4f5a-931e-c76e11c07f94"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.694376 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-2k58g" event={"ID":"dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383","Type":"ContainerStarted","Data":"fef511d5e2c4298672a21a0b504a82b2f3a7318dc0d3e67b676816aca13424f3"} Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.706052 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-twkfs" podStartSLOduration=157.705999507 podStartE2EDuration="2m37.705999507s" podCreationTimestamp="2026-03-20 10:55:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:26.697943353 +0000 UTC m=+230.919304271" watchObservedRunningTime="2026-03-20 10:58:26.705999507 +0000 UTC m=+230.927360425" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.718770 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:26 crc kubenswrapper[4860]: E0320 10:58:26.718949 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:27.218918986 +0000 UTC m=+231.440279884 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.719036 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2af6ac3e-b4b2-400b-bfd2-0142f07dabd0-client-ca\") pod \"controller-manager-9ffd4b47b-9qh65\" (UID: \"2af6ac3e-b4b2-400b-bfd2-0142f07dabd0\") " pod="openshift-controller-manager/controller-manager-9ffd4b47b-9qh65" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.719094 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdqss\" (UniqueName: \"kubernetes.io/projected/2af6ac3e-b4b2-400b-bfd2-0142f07dabd0-kube-api-access-bdqss\") pod \"controller-manager-9ffd4b47b-9qh65\" (UID: \"2af6ac3e-b4b2-400b-bfd2-0142f07dabd0\") " pod="openshift-controller-manager/controller-manager-9ffd4b47b-9qh65" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.719128 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.719154 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/2af6ac3e-b4b2-400b-bfd2-0142f07dabd0-proxy-ca-bundles\") pod \"controller-manager-9ffd4b47b-9qh65\" (UID: \"2af6ac3e-b4b2-400b-bfd2-0142f07dabd0\") " pod="openshift-controller-manager/controller-manager-9ffd4b47b-9qh65" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.719186 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2af6ac3e-b4b2-400b-bfd2-0142f07dabd0-serving-cert\") pod \"controller-manager-9ffd4b47b-9qh65\" (UID: \"2af6ac3e-b4b2-400b-bfd2-0142f07dabd0\") " pod="openshift-controller-manager/controller-manager-9ffd4b47b-9qh65" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.719312 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2af6ac3e-b4b2-400b-bfd2-0142f07dabd0-config\") pod \"controller-manager-9ffd4b47b-9qh65\" (UID: \"2af6ac3e-b4b2-400b-bfd2-0142f07dabd0\") " pod="openshift-controller-manager/controller-manager-9ffd4b47b-9qh65" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.719379 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7g2j\" (UniqueName: \"kubernetes.io/projected/6ef8eec8-b86d-4f5a-931e-c76e11c07f94-kube-api-access-x7g2j\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.719399 4860 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ef8eec8-b86d-4f5a-931e-c76e11c07f94-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.719410 4860 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6ef8eec8-b86d-4f5a-931e-c76e11c07f94-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.719420 4860 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/6ef8eec8-b86d-4f5a-931e-c76e11c07f94-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:26 crc kubenswrapper[4860]: E0320 10:58:26.720124 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:27.220102059 +0000 UTC m=+231.441462957 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.720925 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2af6ac3e-b4b2-400b-bfd2-0142f07dabd0-proxy-ca-bundles\") pod \"controller-manager-9ffd4b47b-9qh65\" (UID: \"2af6ac3e-b4b2-400b-bfd2-0142f07dabd0\") " pod="openshift-controller-manager/controller-manager-9ffd4b47b-9qh65" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.721018 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2af6ac3e-b4b2-400b-bfd2-0142f07dabd0-config\") pod \"controller-manager-9ffd4b47b-9qh65\" (UID: \"2af6ac3e-b4b2-400b-bfd2-0142f07dabd0\") " pod="openshift-controller-manager/controller-manager-9ffd4b47b-9qh65" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.721573 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2af6ac3e-b4b2-400b-bfd2-0142f07dabd0-client-ca\") pod 
\"controller-manager-9ffd4b47b-9qh65\" (UID: \"2af6ac3e-b4b2-400b-bfd2-0142f07dabd0\") " pod="openshift-controller-manager/controller-manager-9ffd4b47b-9qh65" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.735576 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2af6ac3e-b4b2-400b-bfd2-0142f07dabd0-serving-cert\") pod \"controller-manager-9ffd4b47b-9qh65\" (UID: \"2af6ac3e-b4b2-400b-bfd2-0142f07dabd0\") " pod="openshift-controller-manager/controller-manager-9ffd4b47b-9qh65" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.752802 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-45vfv" event={"ID":"825c6b77-c03a-463c-b9a4-d26a1ac398f0","Type":"ContainerStarted","Data":"c602627305bcc67508e306d123ef0f2def758ac6be365554271d4574f4292d93"} Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.753654 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-45vfv" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.771610 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdqss\" (UniqueName: \"kubernetes.io/projected/2af6ac3e-b4b2-400b-bfd2-0142f07dabd0-kube-api-access-bdqss\") pod \"controller-manager-9ffd4b47b-9qh65\" (UID: \"2af6ac3e-b4b2-400b-bfd2-0142f07dabd0\") " pod="openshift-controller-manager/controller-manager-9ffd4b47b-9qh65" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.772066 4860 patch_prober.go:28] interesting pod/downloads-7954f5f757-45vfv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.772118 4860 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-45vfv" 
podUID="825c6b77-c03a-463c-b9a4-d26a1ac398f0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.773725 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6stjq" event={"ID":"efda2c60-f018-417a-a73d-2727be57b558","Type":"ContainerStarted","Data":"6776c7db4ac76edf11af1347a2ba1d42e85d0db1f3adcb8253de7bf78e663beb"} Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.773768 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6stjq" event={"ID":"efda2c60-f018-417a-a73d-2727be57b558","Type":"ContainerStarted","Data":"b30c8b5006161bc9927b1750dd350c7c3092a5250d3aab3e6fff675f0246db09"} Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.778651 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-x4x44" event={"ID":"f5043c51-bd3f-461f-b011-a42ad38ed7d4","Type":"ContainerStarted","Data":"4a60f0b7f3637689f96bd20d58159e91d16d4c90136b6d4143847488cfc66c75"} Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.779874 4860 patch_prober.go:28] interesting pod/console-operator-58897d9998-pc5tf container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.19:8443/readyz\": dial tcp 10.217.0.19:8443: connect: connection refused" start-of-body= Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.779915 4860 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-pc5tf" podUID="7ee7ff56-a3fb-465a-9ab5-8cba6bbfcd0e" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.19:8443/readyz\": dial 
tcp 10.217.0.19:8443: connect: connection refused" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.784796 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nxq82" podUID="f15060fa-5a28-4a12-be7b-2823e921eb90" containerName="route-controller-manager" containerID="cri-o://772e9b2c3443f142ebe0053c3effb51efaae04c134aa23c92ff36573b43b35df" gracePeriod=30 Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.813383 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-45vfv" podStartSLOduration=158.813351223 podStartE2EDuration="2m38.813351223s" podCreationTimestamp="2026-03-20 10:55:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:26.796728841 +0000 UTC m=+231.018089739" watchObservedRunningTime="2026-03-20 10:58:26.813351223 +0000 UTC m=+231.034712121" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.821203 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:26 crc kubenswrapper[4860]: E0320 10:58:26.821736 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:27.321707136 +0000 UTC m=+231.543068034 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.869151 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-x4x44" podStartSLOduration=157.869084224 podStartE2EDuration="2m37.869084224s" podCreationTimestamp="2026-03-20 10:55:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:26.841336102 +0000 UTC m=+231.062697020" watchObservedRunningTime="2026-03-20 10:58:26.869084224 +0000 UTC m=+231.090445122" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.877204 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6stjq" podStartSLOduration=157.877173018 podStartE2EDuration="2m37.877173018s" podCreationTimestamp="2026-03-20 10:55:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:26.873057414 +0000 UTC m=+231.094418312" watchObservedRunningTime="2026-03-20 10:58:26.877173018 +0000 UTC m=+231.098533916" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.924182 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.927625 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-twkfs" Mar 20 10:58:26 crc kubenswrapper[4860]: E0320 10:58:26.927978 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:27.427961901 +0000 UTC m=+231.649322799 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.941136 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-twkfs" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.946723 4860 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-twkfs container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.11:8443/livez\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.946802 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-twkfs" podUID="0eff7ea5-251b-44de-b129-c604349d6e6c" containerName="oauth-apiserver" probeResult="failure" output="Get 
\"https://10.217.0.11:8443/livez\": dial tcp 10.217.0.11:8443: connect: connection refused" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.962708 4860 ???:1] "http: TLS handshake error from 192.168.126.11:54584: no serving certificate available for the kubelet" Mar 20 10:58:26 crc kubenswrapper[4860]: I0320 10:58:26.975456 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-9ffd4b47b-9qh65" Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.009863 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7xnrh"] Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.013216 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7xnrh"] Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.026368 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:27 crc kubenswrapper[4860]: E0320 10:58:27.026843 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:27.526807671 +0000 UTC m=+231.748168569 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.027051 4860 patch_prober.go:28] interesting pod/router-default-5444994796-5gdgj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 10:58:27 crc kubenswrapper[4860]: [-]has-synced failed: reason withheld Mar 20 10:58:27 crc kubenswrapper[4860]: [+]process-running ok Mar 20 10:58:27 crc kubenswrapper[4860]: healthz check failed Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.027099 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5gdgj" podUID="d582dc3e-7510-42be-aa3a-1d15b35c327c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.079286 4860 ???:1] "http: TLS handshake error from 192.168.126.11:54590: no serving certificate available for the kubelet" Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.100941 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.131657 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" 
(UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:27 crc kubenswrapper[4860]: E0320 10:58:27.132032 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:27.632016947 +0000 UTC m=+231.853377845 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.192195 4860 ???:1] "http: TLS handshake error from 192.168.126.11:54600: no serving certificate available for the kubelet" Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.233606 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:27 crc kubenswrapper[4860]: E0320 10:58:27.233989 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:27.733968593 +0000 UTC m=+231.955329491 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.307867 4860 ???:1] "http: TLS handshake error from 192.168.126.11:54612: no serving certificate available for the kubelet" Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.336315 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:27 crc kubenswrapper[4860]: E0320 10:58:27.336663 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:27.836649039 +0000 UTC m=+232.058009937 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.429512 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ef8eec8-b86d-4f5a-931e-c76e11c07f94" path="/var/lib/kubelet/pods/6ef8eec8-b86d-4f5a-931e-c76e11c07f94/volumes" Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.444696 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:27 crc kubenswrapper[4860]: E0320 10:58:27.445080 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:27.945056975 +0000 UTC m=+232.166417873 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.446116 4860 ???:1] "http: TLS handshake error from 192.168.126.11:54626: no serving certificate available for the kubelet" Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.548562 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:27 crc kubenswrapper[4860]: E0320 10:58:27.549141 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:28.049097999 +0000 UTC m=+232.270458897 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.585829 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nxq82" Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.602140 4860 ???:1] "http: TLS handshake error from 192.168.126.11:54632: no serving certificate available for the kubelet" Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.651210 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:27 crc kubenswrapper[4860]: E0320 10:58:27.651643 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:28.151622511 +0000 UTC m=+232.372983409 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.701357 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-9ffd4b47b-9qh65"] Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.754792 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f15060fa-5a28-4a12-be7b-2823e921eb90-config\") pod \"f15060fa-5a28-4a12-be7b-2823e921eb90\" (UID: \"f15060fa-5a28-4a12-be7b-2823e921eb90\") " Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.754880 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f15060fa-5a28-4a12-be7b-2823e921eb90-client-ca\") pod \"f15060fa-5a28-4a12-be7b-2823e921eb90\" (UID: \"f15060fa-5a28-4a12-be7b-2823e921eb90\") " Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.754959 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d88n2\" (UniqueName: \"kubernetes.io/projected/f15060fa-5a28-4a12-be7b-2823e921eb90-kube-api-access-d88n2\") pod \"f15060fa-5a28-4a12-be7b-2823e921eb90\" (UID: \"f15060fa-5a28-4a12-be7b-2823e921eb90\") " Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.755076 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f15060fa-5a28-4a12-be7b-2823e921eb90-serving-cert\") pod \"f15060fa-5a28-4a12-be7b-2823e921eb90\" (UID: 
\"f15060fa-5a28-4a12-be7b-2823e921eb90\") " Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.755484 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.756116 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f15060fa-5a28-4a12-be7b-2823e921eb90-config" (OuterVolumeSpecName: "config") pod "f15060fa-5a28-4a12-be7b-2823e921eb90" (UID: "f15060fa-5a28-4a12-be7b-2823e921eb90"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.756148 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f15060fa-5a28-4a12-be7b-2823e921eb90-client-ca" (OuterVolumeSpecName: "client-ca") pod "f15060fa-5a28-4a12-be7b-2823e921eb90" (UID: "f15060fa-5a28-4a12-be7b-2823e921eb90"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:58:27 crc kubenswrapper[4860]: E0320 10:58:27.757417 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:28.257394823 +0000 UTC m=+232.478755721 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.772744 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f15060fa-5a28-4a12-be7b-2823e921eb90-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f15060fa-5a28-4a12-be7b-2823e921eb90" (UID: "f15060fa-5a28-4a12-be7b-2823e921eb90"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.778742 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f15060fa-5a28-4a12-be7b-2823e921eb90-kube-api-access-d88n2" (OuterVolumeSpecName: "kube-api-access-d88n2") pod "f15060fa-5a28-4a12-be7b-2823e921eb90" (UID: "f15060fa-5a28-4a12-be7b-2823e921eb90"). InnerVolumeSpecName "kube-api-access-d88n2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.815427 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-9ktqw" event={"ID":"b2a9fedf-d226-4388-8432-b22efd3b74bb","Type":"ContainerStarted","Data":"29fa7dcbcc70b6de21bd09a99263130c3ad92f555e2f4e915a5fb6190ecc268c"} Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.858422 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-45vfv" event={"ID":"825c6b77-c03a-463c-b9a4-d26a1ac398f0","Type":"ContainerStarted","Data":"881f32cfad3d1ae79569c4b2cd108eca09988b3ebd33507a5a8247454327f28d"} Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.860311 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.860729 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d88n2\" (UniqueName: \"kubernetes.io/projected/f15060fa-5a28-4a12-be7b-2823e921eb90-kube-api-access-d88n2\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.860749 4860 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f15060fa-5a28-4a12-be7b-2823e921eb90-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.860765 4860 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f15060fa-5a28-4a12-be7b-2823e921eb90-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.860779 4860 reconciler_common.go:293] "Volume 
detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f15060fa-5a28-4a12-be7b-2823e921eb90-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:27 crc kubenswrapper[4860]: E0320 10:58:27.860858 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:28.36083513 +0000 UTC m=+232.582196028 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.862618 4860 patch_prober.go:28] interesting pod/downloads-7954f5f757-45vfv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.862771 4860 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-45vfv" podUID="825c6b77-c03a-463c-b9a4-d26a1ac398f0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.866973 4860 ???:1] "http: TLS handshake error from 192.168.126.11:54642: no serving certificate available for the kubelet" Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.881183 4860 generic.go:334] "Generic (PLEG): container finished" 
podID="f15060fa-5a28-4a12-be7b-2823e921eb90" containerID="772e9b2c3443f142ebe0053c3effb51efaae04c134aa23c92ff36573b43b35df" exitCode=0 Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.881321 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nxq82" event={"ID":"f15060fa-5a28-4a12-be7b-2823e921eb90","Type":"ContainerDied","Data":"772e9b2c3443f142ebe0053c3effb51efaae04c134aa23c92ff36573b43b35df"} Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.881362 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nxq82" event={"ID":"f15060fa-5a28-4a12-be7b-2823e921eb90","Type":"ContainerDied","Data":"a1580a457002eb0c992197304d7aa1c99c6001d60b87a1e21dc5b0c8a7c76848"} Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.881384 4860 scope.go:117] "RemoveContainer" containerID="772e9b2c3443f142ebe0053c3effb51efaae04c134aa23c92ff36573b43b35df" Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.881440 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nxq82" Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.897857 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-wt65f" event={"ID":"1241cd05-23d3-4e5a-9130-29e7638003a9","Type":"ContainerStarted","Data":"a031a1412ae277d24d326cdbaaf60a10073ce180edac74049c4bcbc1b4ab7884"} Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.913410 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-vpp2k" event={"ID":"a8f2eaf6-3749-4695-8df1-5972598c8ac6","Type":"ContainerStarted","Data":"9452fecfc4671cbfdf74ab92361212b9bb37ce495999a676abc9da575051b245"} Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.915989 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-vk2rn" event={"ID":"b7c6fefc-e60e-423d-ad15-2e16173ae01b","Type":"ContainerStarted","Data":"9291001c0f705a7789261ff5f2e547e4ab384e5c4dae053fe45cb20f4ceb5da4"} Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.933735 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-l2xf5" event={"ID":"10eeabbc-0cee-4be7-a0aa-c0fe0c570e0e","Type":"ContainerStarted","Data":"2dd52f924be48f54af4981b865933f6717b8def1257274a13547ce3d43e0cb71"} Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.933784 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-l2xf5" event={"ID":"10eeabbc-0cee-4be7-a0aa-c0fe0c570e0e","Type":"ContainerStarted","Data":"2f319b8086262e609f38a6e6210967d809d5447583848911397473b5f25f8b01"} Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.934323 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-l2xf5" Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.953868 4860 scope.go:117] "RemoveContainer" 
containerID="772e9b2c3443f142ebe0053c3effb51efaae04c134aa23c92ff36573b43b35df" Mar 20 10:58:27 crc kubenswrapper[4860]: E0320 10:58:27.959949 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"772e9b2c3443f142ebe0053c3effb51efaae04c134aa23c92ff36573b43b35df\": container with ID starting with 772e9b2c3443f142ebe0053c3effb51efaae04c134aa23c92ff36573b43b35df not found: ID does not exist" containerID="772e9b2c3443f142ebe0053c3effb51efaae04c134aa23c92ff36573b43b35df" Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.960012 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"772e9b2c3443f142ebe0053c3effb51efaae04c134aa23c92ff36573b43b35df"} err="failed to get container status \"772e9b2c3443f142ebe0053c3effb51efaae04c134aa23c92ff36573b43b35df\": rpc error: code = NotFound desc = could not find container \"772e9b2c3443f142ebe0053c3effb51efaae04c134aa23c92ff36573b43b35df\": container with ID starting with 772e9b2c3443f142ebe0053c3effb51efaae04c134aa23c92ff36573b43b35df not found: ID does not exist" Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.963201 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:27 crc kubenswrapper[4860]: E0320 10:58:27.963844 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:28.463826755 +0000 UTC m=+232.685187653 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.985077 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mcqdw" event={"ID":"628f2025-d050-42a9-bf56-9daa0e5c001b","Type":"ContainerStarted","Data":"a81aa45637bb30ca12bbd6e9b9ccc6139664b58c32f277fca2d07ecbad1b1373"} Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.985553 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mcqdw" event={"ID":"628f2025-d050-42a9-bf56-9daa0e5c001b","Type":"ContainerStarted","Data":"5ad20e85aea3df60e61b092ce121c20cbeb6d50d9aa3330cbe520198797d07b3"} Mar 20 10:58:27 crc kubenswrapper[4860]: I0320 10:58:27.986515 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mcqdw" Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.043798 4860 patch_prober.go:28] interesting pod/router-default-5444994796-5gdgj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 10:58:28 crc kubenswrapper[4860]: [-]has-synced failed: reason withheld Mar 20 10:58:28 crc kubenswrapper[4860]: [+]process-running ok Mar 20 10:58:28 crc kubenswrapper[4860]: healthz check failed Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.044986 4860 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-ingress/router-default-5444994796-5gdgj" podUID="d582dc3e-7510-42be-aa3a-1d15b35c327c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.051540 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rtgqn" event={"ID":"d09bb09c-7ad0-4971-b6a2-1b37bff617b5","Type":"ContainerStarted","Data":"53ddd82409bd7ee46782b6056a3d3362a98c84210ccf946b74b468ffd1e7f06d"} Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.064361 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:28 crc kubenswrapper[4860]: E0320 10:58:28.065612 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:28.565589466 +0000 UTC m=+232.786950364 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.087031 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2glmz" event={"ID":"3c8d3f73-b9b3-48cc-b6a0-f882d45bc9d3","Type":"ContainerStarted","Data":"7445bc2e888e5beade54243d7a640ad66ebdd01e0dcbdbef3a62bad3d7f216f8"} Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.087082 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2glmz" event={"ID":"3c8d3f73-b9b3-48cc-b6a0-f882d45bc9d3","Type":"ContainerStarted","Data":"097a19b0977f7b39523604e06b5911e68d3f8c53d1497fc3a9c3b86af15dc0ed"} Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.115112 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-nxq82"] Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.129774 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-nxq82"] Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.133271 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jnqsw" event={"ID":"8bc351c5-b724-443e-a7e2-f4abba352cef","Type":"ContainerStarted","Data":"3430764001e19d4b3e2e51c6595c5ad940f0983e589f3610bbbdfe28e9d91d1d"} Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.166347 4860 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.166757 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qgkd4" event={"ID":"1414be44-7a88-4f16-9653-51a5793bd729","Type":"ContainerStarted","Data":"841517bb1d97b11c96a8438b06509cff1b86584b43f982f2219e934bfe3b0d80"} Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.166830 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qgkd4" event={"ID":"1414be44-7a88-4f16-9653-51a5793bd729","Type":"ContainerStarted","Data":"33c0b9f7f0bb92315706c73f4bd0e839fc0dee58a8b9016cc9e3acd93e4a9d65"} Mar 20 10:58:28 crc kubenswrapper[4860]: E0320 10:58:28.168743 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:28.668690314 +0000 UTC m=+232.890051212 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.197962 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vfkd4" event={"ID":"34043403-110c-4547-81a4-7af1429878cd","Type":"ContainerStarted","Data":"40266b1b2c1f3a20cf5bc717e097f2ddb0b6b2bc6f5f0f395b6eea3955ef2e54"} Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.199406 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vfkd4" Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.213063 4860 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-vfkd4 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.21:5443/healthz\": dial tcp 10.217.0.21:5443: connect: connection refused" start-of-body= Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.213133 4860 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vfkd4" podUID="34043403-110c-4547-81a4-7af1429878cd" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.21:5443/healthz\": dial tcp 10.217.0.21:5443: connect: connection refused" Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.214038 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xmkqm" Mar 20 10:58:28 crc 
kubenswrapper[4860]: I0320 10:58:28.225238 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zhgh4" event={"ID":"403ca5f6-bd52-40de-88d6-5151b3202c76","Type":"ContainerStarted","Data":"857625adfd41b4c07f7a8c47d596e9428a9381932a0175f6ddfcb8f7de0229a9"} Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.226498 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-zhgh4" Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.253541 4860 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-zhgh4 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body= Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.253618 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-d6wf9" event={"ID":"437c32d4-4b5f-4657-86d6-5214e3bfc01f","Type":"ContainerStarted","Data":"509a0ab6073b8f241ed054d972f10c10904777731b271c4522d9caaf55b66c8c"} Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.253625 4860 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-zhgh4" podUID="403ca5f6-bd52-40de-88d6-5151b3202c76" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.255823 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-vk2rn" podStartSLOduration=159.255793147 podStartE2EDuration="2m39.255793147s" podCreationTimestamp="2026-03-20 10:55:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:28.239152734 +0000 UTC m=+232.460513642" watchObservedRunningTime="2026-03-20 10:58:28.255793147 +0000 UTC m=+232.477154045" Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.274947 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:28 crc kubenswrapper[4860]: E0320 10:58:28.276389 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:28.776372709 +0000 UTC m=+232.997733607 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.303651 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v6ff7" event={"ID":"f171fccd-40ef-44a6-941f-ef1f2f4d2c2a","Type":"ContainerStarted","Data":"8ee0e229e69a6f243d3f1e6dd07c122f01aaf2d19a61572557392196999c3a13"} Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.330013 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mcqdw" podStartSLOduration=159.32999076 podStartE2EDuration="2m39.32999076s" podCreationTimestamp="2026-03-20 10:55:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:28.327729278 +0000 UTC m=+232.549090176" watchObservedRunningTime="2026-03-20 10:58:28.32999076 +0000 UTC m=+232.551351658" Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.346679 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gf5nr" event={"ID":"8fe93f79-239c-4b6a-bd22-bbdf55aff0af","Type":"ContainerStarted","Data":"61502ebc31445385563013c72335cb75130e8d70987eb86e2c3acd4a5cf1b222"} Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.382331 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:28 crc kubenswrapper[4860]: E0320 10:58:28.384321 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:28.884306821 +0000 UTC m=+233.105667719 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.382215 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gf5nr" Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.406801 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jvcqp" event={"ID":"e8d5eff6-150c-4314-8ebc-38b3660ce01a","Type":"ContainerStarted","Data":"b6f16e4fdc471b1f707e108886719c9840d2255ce081c81e52557545b38ad261"} Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.431584 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-l2xf5" podStartSLOduration=9.431557996 podStartE2EDuration="9.431557996s" podCreationTimestamp="2026-03-20 10:58:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:28.429740145 +0000 UTC m=+232.651101043" watchObservedRunningTime="2026-03-20 10:58:28.431557996 +0000 UTC m=+232.652918894" Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.447625 4860 ???:1] "http: TLS handshake error from 192.168.126.11:54646: no serving certificate available for the kubelet" Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.455828 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-x4x44" event={"ID":"f5043c51-bd3f-461f-b011-a42ad38ed7d4","Type":"ContainerStarted","Data":"eb01d817668ae366fca7df1e5aa6b6cd1b6d88ee7369e6ccd483226e47ecce63"} Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.484967 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:28 crc kubenswrapper[4860]: E0320 10:58:28.485877 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:28.985856206 +0000 UTC m=+233.207217104 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.509525 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-9ffd4b47b-9qh65" event={"ID":"2af6ac3e-b4b2-400b-bfd2-0142f07dabd0","Type":"ContainerStarted","Data":"63a0cd6f2cc2d9eeeaa1bb8e6f130de3ebcf3e0a4cb9179f998ccf85f93ed02e"} Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.509584 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-9ffd4b47b-9qh65" Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.531449 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-9ffd4b47b-9qh65" Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.555327 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9ffz6" event={"ID":"9d98ac55-cf65-4f72-805b-dd3da2742004","Type":"ContainerStarted","Data":"1d0d787c7aef31b6076c19907a676d3a10ac78299df210f8e57035654ff980a3"} Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.594396 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 
10:58:28 crc kubenswrapper[4860]: E0320 10:58:28.597437 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:29.097416669 +0000 UTC m=+233.318777567 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.630676 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-9ktqw" podStartSLOduration=159.630652974 podStartE2EDuration="2m39.630652974s" podCreationTimestamp="2026-03-20 10:55:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:28.567711843 +0000 UTC m=+232.789072741" watchObservedRunningTime="2026-03-20 10:58:28.630652974 +0000 UTC m=+232.852013872" Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.632733 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-wt65f" podStartSLOduration=9.632724861 podStartE2EDuration="9.632724861s" podCreationTimestamp="2026-03-20 10:58:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:28.621726655 +0000 UTC m=+232.843087573" watchObservedRunningTime="2026-03-20 10:58:28.632724861 +0000 UTC m=+232.854085759" Mar 20 10:58:28 crc 
kubenswrapper[4860]: I0320 10:58:28.702004 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:28 crc kubenswrapper[4860]: E0320 10:58:28.705188 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:29.205158146 +0000 UTC m=+233.426519044 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.805976 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-pc5tf" Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.809211 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:28 crc kubenswrapper[4860]: E0320 10:58:28.809867 4860 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:29.309841278 +0000 UTC m=+233.531202176 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.825430 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jnqsw" podStartSLOduration=159.82536884 podStartE2EDuration="2m39.82536884s" podCreationTimestamp="2026-03-20 10:55:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:28.712978734 +0000 UTC m=+232.934339632" watchObservedRunningTime="2026-03-20 10:58:28.82536884 +0000 UTC m=+233.046729738" Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.852830 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rtgqn" podStartSLOduration=160.852803683 podStartE2EDuration="2m40.852803683s" podCreationTimestamp="2026-03-20 10:55:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:28.806808124 +0000 UTC m=+233.028169022" watchObservedRunningTime="2026-03-20 10:58:28.852803683 +0000 UTC m=+233.074164581" Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.857916 4860 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qgkd4" podStartSLOduration=160.857897195 podStartE2EDuration="2m40.857897195s" podCreationTimestamp="2026-03-20 10:55:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:28.855353634 +0000 UTC m=+233.076714552" watchObservedRunningTime="2026-03-20 10:58:28.857897195 +0000 UTC m=+233.079258093" Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.910207 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9ffz6" podStartSLOduration=159.910177119 podStartE2EDuration="2m39.910177119s" podCreationTimestamp="2026-03-20 10:55:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:28.90840897 +0000 UTC m=+233.129769878" watchObservedRunningTime="2026-03-20 10:58:28.910177119 +0000 UTC m=+233.131538017" Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.911021 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:28 crc kubenswrapper[4860]: E0320 10:58:28.911545 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:29.411523297 +0000 UTC m=+233.632884195 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:28 crc kubenswrapper[4860]: I0320 10:58:28.975642 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v6ff7" podStartSLOduration=159.97561765 podStartE2EDuration="2m39.97561765s" podCreationTimestamp="2026-03-20 10:55:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:28.972471892 +0000 UTC m=+233.193832810" watchObservedRunningTime="2026-03-20 10:58:28.97561765 +0000 UTC m=+233.196978538" Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.009155 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jvcqp" podStartSLOduration=160.009131562 podStartE2EDuration="2m40.009131562s" podCreationTimestamp="2026-03-20 10:55:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:29.00690497 +0000 UTC m=+233.228265868" watchObservedRunningTime="2026-03-20 10:58:29.009131562 +0000 UTC m=+233.230492460" Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.015991 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:29 crc kubenswrapper[4860]: E0320 10:58:29.016375 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:29.516358913 +0000 UTC m=+233.737719811 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.026573 4860 patch_prober.go:28] interesting pod/router-default-5444994796-5gdgj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 10:58:29 crc kubenswrapper[4860]: [-]has-synced failed: reason withheld Mar 20 10:58:29 crc kubenswrapper[4860]: [+]process-running ok Mar 20 10:58:29 crc kubenswrapper[4860]: healthz check failed Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.026661 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5gdgj" podUID="d582dc3e-7510-42be-aa3a-1d15b35c327c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.030494 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5dc8897f6c-8dhfx"] Mar 
20 10:58:29 crc kubenswrapper[4860]: E0320 10:58:29.030747 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f15060fa-5a28-4a12-be7b-2823e921eb90" containerName="route-controller-manager" Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.030770 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="f15060fa-5a28-4a12-be7b-2823e921eb90" containerName="route-controller-manager" Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.030871 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="f15060fa-5a28-4a12-be7b-2823e921eb90" containerName="route-controller-manager" Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.031401 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5dc8897f6c-8dhfx" Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.054933 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.054986 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.055024 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.055053 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.055159 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.078500 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 
10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.096589 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5dc8897f6c-8dhfx"] Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.120202 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:29 crc kubenswrapper[4860]: E0320 10:58:29.120499 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:29.620470569 +0000 UTC m=+233.841831467 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.120994 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/215a61d8-f0e1-419d-b4cb-8ddc801d5a79-config\") pod \"route-controller-manager-5dc8897f6c-8dhfx\" (UID: \"215a61d8-f0e1-419d-b4cb-8ddc801d5a79\") " pod="openshift-route-controller-manager/route-controller-manager-5dc8897f6c-8dhfx" Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.121259 4860 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.121412 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7npd\" (UniqueName: \"kubernetes.io/projected/215a61d8-f0e1-419d-b4cb-8ddc801d5a79-kube-api-access-n7npd\") pod \"route-controller-manager-5dc8897f6c-8dhfx\" (UID: \"215a61d8-f0e1-419d-b4cb-8ddc801d5a79\") " pod="openshift-route-controller-manager/route-controller-manager-5dc8897f6c-8dhfx" Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.121533 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/215a61d8-f0e1-419d-b4cb-8ddc801d5a79-client-ca\") pod \"route-controller-manager-5dc8897f6c-8dhfx\" (UID: \"215a61d8-f0e1-419d-b4cb-8ddc801d5a79\") " pod="openshift-route-controller-manager/route-controller-manager-5dc8897f6c-8dhfx" Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.121681 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/215a61d8-f0e1-419d-b4cb-8ddc801d5a79-serving-cert\") pod \"route-controller-manager-5dc8897f6c-8dhfx\" (UID: \"215a61d8-f0e1-419d-b4cb-8ddc801d5a79\") " pod="openshift-route-controller-manager/route-controller-manager-5dc8897f6c-8dhfx" Mar 20 10:58:29 crc kubenswrapper[4860]: E0320 10:58:29.122190 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-20 10:58:29.622173666 +0000 UTC m=+233.843534564 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.163723 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-9ffd4b47b-9qh65" podStartSLOduration=4.163700511 podStartE2EDuration="4.163700511s" podCreationTimestamp="2026-03-20 10:58:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:29.159991688 +0000 UTC m=+233.381352586" watchObservedRunningTime="2026-03-20 10:58:29.163700511 +0000 UTC m=+233.385061409" Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.187130 4860 ???:1] "http: TLS handshake error from 192.168.126.11:54654: no serving certificate available for the kubelet" Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.196417 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2glmz" podStartSLOduration=160.196395861 podStartE2EDuration="2m40.196395861s" podCreationTimestamp="2026-03-20 10:55:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:29.196167654 +0000 UTC m=+233.417528562" watchObservedRunningTime="2026-03-20 10:58:29.196395861 +0000 UTC m=+233.417756759" Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 
10:58:29.224486 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.224816 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7npd\" (UniqueName: \"kubernetes.io/projected/215a61d8-f0e1-419d-b4cb-8ddc801d5a79-kube-api-access-n7npd\") pod \"route-controller-manager-5dc8897f6c-8dhfx\" (UID: \"215a61d8-f0e1-419d-b4cb-8ddc801d5a79\") " pod="openshift-route-controller-manager/route-controller-manager-5dc8897f6c-8dhfx" Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.224863 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/215a61d8-f0e1-419d-b4cb-8ddc801d5a79-client-ca\") pod \"route-controller-manager-5dc8897f6c-8dhfx\" (UID: \"215a61d8-f0e1-419d-b4cb-8ddc801d5a79\") " pod="openshift-route-controller-manager/route-controller-manager-5dc8897f6c-8dhfx" Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.224919 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/215a61d8-f0e1-419d-b4cb-8ddc801d5a79-serving-cert\") pod \"route-controller-manager-5dc8897f6c-8dhfx\" (UID: \"215a61d8-f0e1-419d-b4cb-8ddc801d5a79\") " pod="openshift-route-controller-manager/route-controller-manager-5dc8897f6c-8dhfx" Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.224945 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/215a61d8-f0e1-419d-b4cb-8ddc801d5a79-config\") pod \"route-controller-manager-5dc8897f6c-8dhfx\" (UID: 
\"215a61d8-f0e1-419d-b4cb-8ddc801d5a79\") " pod="openshift-route-controller-manager/route-controller-manager-5dc8897f6c-8dhfx" Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.226257 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/215a61d8-f0e1-419d-b4cb-8ddc801d5a79-config\") pod \"route-controller-manager-5dc8897f6c-8dhfx\" (UID: \"215a61d8-f0e1-419d-b4cb-8ddc801d5a79\") " pod="openshift-route-controller-manager/route-controller-manager-5dc8897f6c-8dhfx" Mar 20 10:58:29 crc kubenswrapper[4860]: E0320 10:58:29.226784 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:29.726755715 +0000 UTC m=+233.948116773 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.226841 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/215a61d8-f0e1-419d-b4cb-8ddc801d5a79-client-ca\") pod \"route-controller-manager-5dc8897f6c-8dhfx\" (UID: \"215a61d8-f0e1-419d-b4cb-8ddc801d5a79\") " pod="openshift-route-controller-manager/route-controller-manager-5dc8897f6c-8dhfx" Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.236395 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/215a61d8-f0e1-419d-b4cb-8ddc801d5a79-serving-cert\") pod \"route-controller-manager-5dc8897f6c-8dhfx\" (UID: \"215a61d8-f0e1-419d-b4cb-8ddc801d5a79\") " pod="openshift-route-controller-manager/route-controller-manager-5dc8897f6c-8dhfx" Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.257921 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vfkd4" podStartSLOduration=160.257895762 podStartE2EDuration="2m40.257895762s" podCreationTimestamp="2026-03-20 10:55:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:29.257363877 +0000 UTC m=+233.478724785" watchObservedRunningTime="2026-03-20 10:58:29.257895762 +0000 UTC m=+233.479256660" Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.278164 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7npd\" (UniqueName: \"kubernetes.io/projected/215a61d8-f0e1-419d-b4cb-8ddc801d5a79-kube-api-access-n7npd\") pod \"route-controller-manager-5dc8897f6c-8dhfx\" (UID: \"215a61d8-f0e1-419d-b4cb-8ddc801d5a79\") " pod="openshift-route-controller-manager/route-controller-manager-5dc8897f6c-8dhfx" Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.314426 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-d6wf9" podStartSLOduration=161.314393583 podStartE2EDuration="2m41.314393583s" podCreationTimestamp="2026-03-20 10:55:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:29.307581754 +0000 UTC m=+233.528942652" watchObservedRunningTime="2026-03-20 10:58:29.314393583 +0000 UTC m=+233.535754501" Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.327123 4860 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:29 crc kubenswrapper[4860]: E0320 10:58:29.327935 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:29.827910709 +0000 UTC m=+234.049271607 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.359396 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-zhgh4" podStartSLOduration=160.359364904 podStartE2EDuration="2m40.359364904s" podCreationTimestamp="2026-03-20 10:55:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:29.339168252 +0000 UTC m=+233.560529160" watchObservedRunningTime="2026-03-20 10:58:29.359364904 +0000 UTC m=+233.580725802" Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.396612 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5dc8897f6c-8dhfx" Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.431490 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:29 crc kubenswrapper[4860]: E0320 10:58:29.431997 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:29.931971394 +0000 UTC m=+234.153332292 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.432096 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f15060fa-5a28-4a12-be7b-2823e921eb90" path="/var/lib/kubelet/pods/f15060fa-5a28-4a12-be7b-2823e921eb90/volumes" Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.533405 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:29 crc kubenswrapper[4860]: E0320 10:58:29.533928 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:30.033909579 +0000 UTC m=+234.255270477 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.601656 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-9ffd4b47b-9qh65" event={"ID":"2af6ac3e-b4b2-400b-bfd2-0142f07dabd0","Type":"ContainerStarted","Data":"9e7a764e0c672086dc2232d2c01074598903d934f8d665fc7b2f1e55bf12ba31"} Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.610909 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-2k58g" event={"ID":"dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383","Type":"ContainerStarted","Data":"3ebdddedcf41b1b5a3ca26aed44754d8031631ac1336491746cb50ab65e583f6"} Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.622900 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-vpp2k" event={"ID":"a8f2eaf6-3749-4695-8df1-5972598c8ac6","Type":"ContainerStarted","Data":"e9f15176d1fd4501f85ecbe88e065250d07fa7387e88e17ed25ec8434312c211"} Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.628591 4860 patch_prober.go:28] interesting 
pod/marketplace-operator-79b997595-zhgh4 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body= Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.628671 4860 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-zhgh4" podUID="403ca5f6-bd52-40de-88d6-5151b3202c76" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.629963 4860 patch_prober.go:28] interesting pod/downloads-7954f5f757-45vfv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.630032 4860 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-45vfv" podUID="825c6b77-c03a-463c-b9a4-d26a1ac398f0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.630265 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9ffz6" Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.648689 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:29 crc kubenswrapper[4860]: E0320 
10:58:29.648824 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:30.148800844 +0000 UTC m=+234.370161742 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.649088 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:29 crc kubenswrapper[4860]: E0320 10:58:29.649543 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:30.149533744 +0000 UTC m=+234.370894642 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.702880 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9ffz6" Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.743309 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-vpp2k" podStartSLOduration=161.743283052 podStartE2EDuration="2m41.743283052s" podCreationTimestamp="2026-03-20 10:55:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:29.701657624 +0000 UTC m=+233.923018522" watchObservedRunningTime="2026-03-20 10:58:29.743283052 +0000 UTC m=+233.964643950" Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.751617 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:29 crc kubenswrapper[4860]: E0320 10:58:29.753573 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 10:58:30.253517197 +0000 UTC m=+234.474878085 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.866853 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:29 crc kubenswrapper[4860]: E0320 10:58:29.867303 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:30.367288951 +0000 UTC m=+234.588649849 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:29 crc kubenswrapper[4860]: I0320 10:58:29.976994 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:29 crc kubenswrapper[4860]: E0320 10:58:29.977647 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:30.477625681 +0000 UTC m=+234.698986579 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.031261 4860 patch_prober.go:28] interesting pod/router-default-5444994796-5gdgj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 10:58:30 crc kubenswrapper[4860]: [-]has-synced failed: reason withheld Mar 20 10:58:30 crc kubenswrapper[4860]: [+]process-running ok Mar 20 10:58:30 crc kubenswrapper[4860]: healthz check failed Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.031357 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5gdgj" podUID="d582dc3e-7510-42be-aa3a-1d15b35c327c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.085366 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:30 crc kubenswrapper[4860]: E0320 10:58:30.085790 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-20 10:58:30.585773079 +0000 UTC m=+234.807133977 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.136777 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-n79b7"] Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.138005 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n79b7" Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.142849 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.171639 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vfkd4" Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.177773 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n79b7"] Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.186553 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:30 crc kubenswrapper[4860]: E0320 10:58:30.187050 4860 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:30.687003755 +0000 UTC m=+234.908364663 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.187602 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5dc8897f6c-8dhfx"] Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.290165 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2690d8b-c7f7-4e71-af44-33444e4d6187-catalog-content\") pod \"community-operators-n79b7\" (UID: \"d2690d8b-c7f7-4e71-af44-33444e4d6187\") " pod="openshift-marketplace/community-operators-n79b7" Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.290248 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fks5\" (UniqueName: \"kubernetes.io/projected/d2690d8b-c7f7-4e71-af44-33444e4d6187-kube-api-access-6fks5\") pod \"community-operators-n79b7\" (UID: \"d2690d8b-c7f7-4e71-af44-33444e4d6187\") " pod="openshift-marketplace/community-operators-n79b7" Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.290285 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2690d8b-c7f7-4e71-af44-33444e4d6187-utilities\") pod \"community-operators-n79b7\" (UID: \"d2690d8b-c7f7-4e71-af44-33444e4d6187\") " pod="openshift-marketplace/community-operators-n79b7" Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.290328 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:30 crc kubenswrapper[4860]: E0320 10:58:30.290690 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:30.790675369 +0000 UTC m=+235.012036267 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.346563 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-w5w95"] Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.347574 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-w5w95" Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.392218 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.392509 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2690d8b-c7f7-4e71-af44-33444e4d6187-catalog-content\") pod \"community-operators-n79b7\" (UID: \"d2690d8b-c7f7-4e71-af44-33444e4d6187\") " pod="openshift-marketplace/community-operators-n79b7" Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.392576 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fks5\" (UniqueName: \"kubernetes.io/projected/d2690d8b-c7f7-4e71-af44-33444e4d6187-kube-api-access-6fks5\") pod \"community-operators-n79b7\" (UID: \"d2690d8b-c7f7-4e71-af44-33444e4d6187\") " pod="openshift-marketplace/community-operators-n79b7" Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.392626 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2690d8b-c7f7-4e71-af44-33444e4d6187-utilities\") pod \"community-operators-n79b7\" (UID: \"d2690d8b-c7f7-4e71-af44-33444e4d6187\") " pod="openshift-marketplace/community-operators-n79b7" Mar 20 10:58:30 crc kubenswrapper[4860]: E0320 10:58:30.392721 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 10:58:30.892677976 +0000 UTC m=+235.114038874 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.392930 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.393127 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2690d8b-c7f7-4e71-af44-33444e4d6187-utilities\") pod \"community-operators-n79b7\" (UID: \"d2690d8b-c7f7-4e71-af44-33444e4d6187\") " pod="openshift-marketplace/community-operators-n79b7" Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.393560 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2690d8b-c7f7-4e71-af44-33444e4d6187-catalog-content\") pod \"community-operators-n79b7\" (UID: \"d2690d8b-c7f7-4e71-af44-33444e4d6187\") " pod="openshift-marketplace/community-operators-n79b7" Mar 20 10:58:30 crc kubenswrapper[4860]: E0320 10:58:30.393989 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:30.893963382 +0000 UTC m=+235.115324440 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.494652 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:30 crc kubenswrapper[4860]: E0320 10:58:30.495063 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:30.995019063 +0000 UTC m=+235.216379961 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.495152 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f84f111-5991-4e78-9508-82283b8e36f7-catalog-content\") pod \"certified-operators-w5w95\" (UID: \"4f84f111-5991-4e78-9508-82283b8e36f7\") " pod="openshift-marketplace/certified-operators-w5w95" Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.495282 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f84f111-5991-4e78-9508-82283b8e36f7-utilities\") pod \"certified-operators-w5w95\" (UID: \"4f84f111-5991-4e78-9508-82283b8e36f7\") " pod="openshift-marketplace/certified-operators-w5w95" Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.495403 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.495509 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4h2z2\" (UniqueName: 
\"kubernetes.io/projected/4f84f111-5991-4e78-9508-82283b8e36f7-kube-api-access-4h2z2\") pod \"certified-operators-w5w95\" (UID: \"4f84f111-5991-4e78-9508-82283b8e36f7\") " pod="openshift-marketplace/certified-operators-w5w95" Mar 20 10:58:30 crc kubenswrapper[4860]: E0320 10:58:30.495859 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:30.995853066 +0000 UTC m=+235.217213964 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.597037 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:30 crc kubenswrapper[4860]: E0320 10:58:30.597296 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:31.097252776 +0000 UTC m=+235.318613674 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.597373 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f84f111-5991-4e78-9508-82283b8e36f7-utilities\") pod \"certified-operators-w5w95\" (UID: \"4f84f111-5991-4e78-9508-82283b8e36f7\") " pod="openshift-marketplace/certified-operators-w5w95" Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.597456 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.597504 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4h2z2\" (UniqueName: \"kubernetes.io/projected/4f84f111-5991-4e78-9508-82283b8e36f7-kube-api-access-4h2z2\") pod \"certified-operators-w5w95\" (UID: \"4f84f111-5991-4e78-9508-82283b8e36f7\") " pod="openshift-marketplace/certified-operators-w5w95" Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.597557 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f84f111-5991-4e78-9508-82283b8e36f7-catalog-content\") pod \"certified-operators-w5w95\" (UID: 
\"4f84f111-5991-4e78-9508-82283b8e36f7\") " pod="openshift-marketplace/certified-operators-w5w95" Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.598615 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f84f111-5991-4e78-9508-82283b8e36f7-catalog-content\") pod \"certified-operators-w5w95\" (UID: \"4f84f111-5991-4e78-9508-82283b8e36f7\") " pod="openshift-marketplace/certified-operators-w5w95" Mar 20 10:58:30 crc kubenswrapper[4860]: E0320 10:58:30.598938 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:31.098922353 +0000 UTC m=+235.320283251 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.599160 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f84f111-5991-4e78-9508-82283b8e36f7-utilities\") pod \"certified-operators-w5w95\" (UID: \"4f84f111-5991-4e78-9508-82283b8e36f7\") " pod="openshift-marketplace/certified-operators-w5w95" Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.624792 4860 ???:1] "http: TLS handshake error from 192.168.126.11:42882: no serving certificate available for the kubelet" Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.649303 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-5dc8897f6c-8dhfx" event={"ID":"215a61d8-f0e1-419d-b4cb-8ddc801d5a79","Type":"ContainerStarted","Data":"07cb2faf1a3563eaa92fb2a9f50a19ad2493c7dab6c573aa3b095afac43feb29"} Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.649380 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5dc8897f6c-8dhfx" event={"ID":"215a61d8-f0e1-419d-b4cb-8ddc801d5a79","Type":"ContainerStarted","Data":"7aa219357098d2e5cc353f906dff76cf1a673bc6396fbbfabef96091000e9adc"} Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.650874 4860 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-zhgh4 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body= Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.651315 4860 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-zhgh4" podUID="403ca5f6-bd52-40de-88d6-5151b3202c76" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.699325 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:30 crc kubenswrapper[4860]: E0320 10:58:30.699716 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b 
nodeName:}" failed. No retries permitted until 2026-03-20 10:58:31.199698996 +0000 UTC m=+235.421059894 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.745120 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w5w95"] Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.752044 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fks5\" (UniqueName: \"kubernetes.io/projected/d2690d8b-c7f7-4e71-af44-33444e4d6187-kube-api-access-6fks5\") pod \"community-operators-n79b7\" (UID: \"d2690d8b-c7f7-4e71-af44-33444e4d6187\") " pod="openshift-marketplace/community-operators-n79b7" Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.752170 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4h2z2\" (UniqueName: \"kubernetes.io/projected/4f84f111-5991-4e78-9508-82283b8e36f7-kube-api-access-4h2z2\") pod \"certified-operators-w5w95\" (UID: \"4f84f111-5991-4e78-9508-82283b8e36f7\") " pod="openshift-marketplace/certified-operators-w5w95" Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.755533 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.762797 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-n79b7" Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.763332 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-r7ckk"] Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.764560 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r7ckk" Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.767887 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r7ckk"] Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.801923 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:30 crc kubenswrapper[4860]: E0320 10:58:30.802383 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:31.302364622 +0000 UTC m=+235.523725520 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.823328 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-p6sh9"] Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.824461 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p6sh9" Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.854876 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p6sh9"] Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.914897 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.915161 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0e14a08-824b-450f-bf98-2a476da0d44b-catalog-content\") pod \"community-operators-r7ckk\" (UID: \"f0e14a08-824b-450f-bf98-2a476da0d44b\") " pod="openshift-marketplace/community-operators-r7ckk" Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.915209 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77rcr\" 
(UniqueName: \"kubernetes.io/projected/7b622f82-e01c-42b8-8061-16b6e8f551fb-kube-api-access-77rcr\") pod \"certified-operators-p6sh9\" (UID: \"7b622f82-e01c-42b8-8061-16b6e8f551fb\") " pod="openshift-marketplace/certified-operators-p6sh9" Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.915258 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0e14a08-824b-450f-bf98-2a476da0d44b-utilities\") pod \"community-operators-r7ckk\" (UID: \"f0e14a08-824b-450f-bf98-2a476da0d44b\") " pod="openshift-marketplace/community-operators-r7ckk" Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.915284 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b622f82-e01c-42b8-8061-16b6e8f551fb-utilities\") pod \"certified-operators-p6sh9\" (UID: \"7b622f82-e01c-42b8-8061-16b6e8f551fb\") " pod="openshift-marketplace/certified-operators-p6sh9" Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.915302 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b622f82-e01c-42b8-8061-16b6e8f551fb-catalog-content\") pod \"certified-operators-p6sh9\" (UID: \"7b622f82-e01c-42b8-8061-16b6e8f551fb\") " pod="openshift-marketplace/certified-operators-p6sh9" Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.915366 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhz7s\" (UniqueName: \"kubernetes.io/projected/f0e14a08-824b-450f-bf98-2a476da0d44b-kube-api-access-jhz7s\") pod \"community-operators-r7ckk\" (UID: \"f0e14a08-824b-450f-bf98-2a476da0d44b\") " pod="openshift-marketplace/community-operators-r7ckk" Mar 20 10:58:30 crc kubenswrapper[4860]: E0320 10:58:30.915506 4860 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:31.415484658 +0000 UTC m=+235.636845566 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.934457 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5dc8897f6c-8dhfx" podStartSLOduration=5.934433086 podStartE2EDuration="5.934433086s" podCreationTimestamp="2026-03-20 10:58:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:30.901439068 +0000 UTC m=+235.122799966" watchObservedRunningTime="2026-03-20 10:58:30.934433086 +0000 UTC m=+235.155793984" Mar 20 10:58:30 crc kubenswrapper[4860]: I0320 10:58:30.982127 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-w5w95" Mar 20 10:58:31 crc kubenswrapper[4860]: I0320 10:58:31.017414 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhz7s\" (UniqueName: \"kubernetes.io/projected/f0e14a08-824b-450f-bf98-2a476da0d44b-kube-api-access-jhz7s\") pod \"community-operators-r7ckk\" (UID: \"f0e14a08-824b-450f-bf98-2a476da0d44b\") " pod="openshift-marketplace/community-operators-r7ckk" Mar 20 10:58:31 crc kubenswrapper[4860]: I0320 10:58:31.017502 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:31 crc kubenswrapper[4860]: I0320 10:58:31.017550 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0e14a08-824b-450f-bf98-2a476da0d44b-catalog-content\") pod \"community-operators-r7ckk\" (UID: \"f0e14a08-824b-450f-bf98-2a476da0d44b\") " pod="openshift-marketplace/community-operators-r7ckk" Mar 20 10:58:31 crc kubenswrapper[4860]: I0320 10:58:31.017586 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77rcr\" (UniqueName: \"kubernetes.io/projected/7b622f82-e01c-42b8-8061-16b6e8f551fb-kube-api-access-77rcr\") pod \"certified-operators-p6sh9\" (UID: \"7b622f82-e01c-42b8-8061-16b6e8f551fb\") " pod="openshift-marketplace/certified-operators-p6sh9" Mar 20 10:58:31 crc kubenswrapper[4860]: I0320 10:58:31.017635 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0e14a08-824b-450f-bf98-2a476da0d44b-utilities\") pod 
\"community-operators-r7ckk\" (UID: \"f0e14a08-824b-450f-bf98-2a476da0d44b\") " pod="openshift-marketplace/community-operators-r7ckk" Mar 20 10:58:31 crc kubenswrapper[4860]: I0320 10:58:31.017658 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b622f82-e01c-42b8-8061-16b6e8f551fb-utilities\") pod \"certified-operators-p6sh9\" (UID: \"7b622f82-e01c-42b8-8061-16b6e8f551fb\") " pod="openshift-marketplace/certified-operators-p6sh9" Mar 20 10:58:31 crc kubenswrapper[4860]: I0320 10:58:31.017676 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b622f82-e01c-42b8-8061-16b6e8f551fb-catalog-content\") pod \"certified-operators-p6sh9\" (UID: \"7b622f82-e01c-42b8-8061-16b6e8f551fb\") " pod="openshift-marketplace/certified-operators-p6sh9" Mar 20 10:58:31 crc kubenswrapper[4860]: I0320 10:58:31.021948 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0e14a08-824b-450f-bf98-2a476da0d44b-catalog-content\") pod \"community-operators-r7ckk\" (UID: \"f0e14a08-824b-450f-bf98-2a476da0d44b\") " pod="openshift-marketplace/community-operators-r7ckk" Mar 20 10:58:31 crc kubenswrapper[4860]: I0320 10:58:31.021943 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0e14a08-824b-450f-bf98-2a476da0d44b-utilities\") pod \"community-operators-r7ckk\" (UID: \"f0e14a08-824b-450f-bf98-2a476da0d44b\") " pod="openshift-marketplace/community-operators-r7ckk" Mar 20 10:58:31 crc kubenswrapper[4860]: E0320 10:58:31.022764 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-20 10:58:31.522745982 +0000 UTC m=+235.744106880 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:31 crc kubenswrapper[4860]: I0320 10:58:31.032883 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b622f82-e01c-42b8-8061-16b6e8f551fb-utilities\") pod \"certified-operators-p6sh9\" (UID: \"7b622f82-e01c-42b8-8061-16b6e8f551fb\") " pod="openshift-marketplace/certified-operators-p6sh9" Mar 20 10:58:31 crc kubenswrapper[4860]: I0320 10:58:31.033548 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b622f82-e01c-42b8-8061-16b6e8f551fb-catalog-content\") pod \"certified-operators-p6sh9\" (UID: \"7b622f82-e01c-42b8-8061-16b6e8f551fb\") " pod="openshift-marketplace/certified-operators-p6sh9" Mar 20 10:58:31 crc kubenswrapper[4860]: I0320 10:58:31.039652 4860 patch_prober.go:28] interesting pod/router-default-5444994796-5gdgj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 10:58:31 crc kubenswrapper[4860]: [-]has-synced failed: reason withheld Mar 20 10:58:31 crc kubenswrapper[4860]: [+]process-running ok Mar 20 10:58:31 crc kubenswrapper[4860]: healthz check failed Mar 20 10:58:31 crc kubenswrapper[4860]: I0320 10:58:31.039727 4860 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-5gdgj" podUID="d582dc3e-7510-42be-aa3a-1d15b35c327c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 10:58:31 crc kubenswrapper[4860]: I0320 10:58:31.055332 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77rcr\" (UniqueName: \"kubernetes.io/projected/7b622f82-e01c-42b8-8061-16b6e8f551fb-kube-api-access-77rcr\") pod \"certified-operators-p6sh9\" (UID: \"7b622f82-e01c-42b8-8061-16b6e8f551fb\") " pod="openshift-marketplace/certified-operators-p6sh9" Mar 20 10:58:31 crc kubenswrapper[4860]: I0320 10:58:31.066334 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhz7s\" (UniqueName: \"kubernetes.io/projected/f0e14a08-824b-450f-bf98-2a476da0d44b-kube-api-access-jhz7s\") pod \"community-operators-r7ckk\" (UID: \"f0e14a08-824b-450f-bf98-2a476da0d44b\") " pod="openshift-marketplace/community-operators-r7ckk" Mar 20 10:58:31 crc kubenswrapper[4860]: I0320 10:58:31.118981 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:31 crc kubenswrapper[4860]: E0320 10:58:31.119862 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:31.619844183 +0000 UTC m=+235.841205081 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:31 crc kubenswrapper[4860]: I0320 10:58:31.162069 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r7ckk" Mar 20 10:58:31 crc kubenswrapper[4860]: I0320 10:58:31.179701 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p6sh9" Mar 20 10:58:31 crc kubenswrapper[4860]: I0320 10:58:31.222121 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:31 crc kubenswrapper[4860]: E0320 10:58:31.222566 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:31.72254957 +0000 UTC m=+235.943910468 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:31 crc kubenswrapper[4860]: I0320 10:58:31.322934 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:31 crc kubenswrapper[4860]: E0320 10:58:31.323352 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:31.823334523 +0000 UTC m=+236.044695421 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:31 crc kubenswrapper[4860]: I0320 10:58:31.426757 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:31 crc kubenswrapper[4860]: E0320 10:58:31.427122 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:31.92710795 +0000 UTC m=+236.148468848 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:31 crc kubenswrapper[4860]: I0320 10:58:31.540650 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:31 crc kubenswrapper[4860]: E0320 10:58:31.540857 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:32.040819033 +0000 UTC m=+236.262179931 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:31 crc kubenswrapper[4860]: I0320 10:58:31.541804 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:31 crc kubenswrapper[4860]: E0320 10:58:31.542328 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:32.042314105 +0000 UTC m=+236.263675003 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:31 crc kubenswrapper[4860]: I0320 10:58:31.645055 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:31 crc kubenswrapper[4860]: E0320 10:58:31.645579 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:32.145538946 +0000 UTC m=+236.366899844 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:31 crc kubenswrapper[4860]: I0320 10:58:31.645695 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:31 crc kubenswrapper[4860]: E0320 10:58:31.646273 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:32.146265386 +0000 UTC m=+236.367626284 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:31 crc kubenswrapper[4860]: I0320 10:58:31.713965 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w5w95"] Mar 20 10:58:31 crc kubenswrapper[4860]: I0320 10:58:31.716302 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-2k58g" event={"ID":"dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383","Type":"ContainerStarted","Data":"ef24e9f8d4b6a43cc833f7f0d60bbe74efda0e84841e788eaabadc9fd54fdd4f"} Mar 20 10:58:31 crc kubenswrapper[4860]: I0320 10:58:31.716601 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5dc8897f6c-8dhfx" Mar 20 10:58:31 crc kubenswrapper[4860]: I0320 10:58:31.750123 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:31 crc kubenswrapper[4860]: E0320 10:58:31.750763 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:32.250733922 +0000 UTC m=+236.472094820 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:31 crc kubenswrapper[4860]: I0320 10:58:31.764915 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n79b7"] Mar 20 10:58:31 crc kubenswrapper[4860]: I0320 10:58:31.823673 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r7ckk"] Mar 20 10:58:31 crc kubenswrapper[4860]: W0320 10:58:31.826678 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2690d8b_c7f7_4e71_af44_33444e4d6187.slice/crio-d3e5a4f45fcca5a9f1ea6868d200bf518732a2225ce154b3a8c6e2fe9edbe0fc WatchSource:0}: Error finding container d3e5a4f45fcca5a9f1ea6868d200bf518732a2225ce154b3a8c6e2fe9edbe0fc: Status 404 returned error can't find the container with id d3e5a4f45fcca5a9f1ea6868d200bf518732a2225ce154b3a8c6e2fe9edbe0fc Mar 20 10:58:31 crc kubenswrapper[4860]: I0320 10:58:31.854027 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:31 crc kubenswrapper[4860]: E0320 10:58:31.855148 4860 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:32.355130236 +0000 UTC m=+236.576491134 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:31 crc kubenswrapper[4860]: I0320 10:58:31.902638 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5dc8897f6c-8dhfx" Mar 20 10:58:31 crc kubenswrapper[4860]: I0320 10:58:31.955495 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:31 crc kubenswrapper[4860]: E0320 10:58:31.955915 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:32.455890419 +0000 UTC m=+236.677251317 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:31 crc kubenswrapper[4860]: I0320 10:58:31.965536 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-twkfs" Mar 20 10:58:31 crc kubenswrapper[4860]: I0320 10:58:31.996158 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-twkfs" Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.011865 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p6sh9"] Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.023305 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-5gdgj" Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.030434 4860 patch_prober.go:28] interesting pod/router-default-5444994796-5gdgj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 10:58:32 crc kubenswrapper[4860]: [-]has-synced failed: reason withheld Mar 20 10:58:32 crc kubenswrapper[4860]: [+]process-running ok Mar 20 10:58:32 crc kubenswrapper[4860]: healthz check failed Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.030485 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5gdgj" podUID="d582dc3e-7510-42be-aa3a-1d15b35c327c" containerName="router" probeResult="failure" output="HTTP 
probe failed with statuscode: 500" Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.071029 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:32 crc kubenswrapper[4860]: E0320 10:58:32.071468 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:32.571445843 +0000 UTC m=+236.792806741 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.120656 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-d9xlp"] Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.121824 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d9xlp" Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.130831 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.143428 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d9xlp"] Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.174038 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:32 crc kubenswrapper[4860]: E0320 10:58:32.174363 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:32.674329975 +0000 UTC m=+236.895690873 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.174417 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f20cb95e-5480-4c9c-859f-0b03d679ab06-catalog-content\") pod \"redhat-marketplace-d9xlp\" (UID: \"f20cb95e-5480-4c9c-859f-0b03d679ab06\") " pod="openshift-marketplace/redhat-marketplace-d9xlp" Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.174461 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f20cb95e-5480-4c9c-859f-0b03d679ab06-utilities\") pod \"redhat-marketplace-d9xlp\" (UID: \"f20cb95e-5480-4c9c-859f-0b03d679ab06\") " pod="openshift-marketplace/redhat-marketplace-d9xlp" Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.174515 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kg46p\" (UniqueName: \"kubernetes.io/projected/f20cb95e-5480-4c9c-859f-0b03d679ab06-kube-api-access-kg46p\") pod \"redhat-marketplace-d9xlp\" (UID: \"f20cb95e-5480-4c9c-859f-0b03d679ab06\") " pod="openshift-marketplace/redhat-marketplace-d9xlp" Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.174632 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:32 crc kubenswrapper[4860]: E0320 10:58:32.175050 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:32.675033575 +0000 UTC m=+236.896394473 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.277900 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.283156 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f20cb95e-5480-4c9c-859f-0b03d679ab06-catalog-content\") pod \"redhat-marketplace-d9xlp\" (UID: \"f20cb95e-5480-4c9c-859f-0b03d679ab06\") " pod="openshift-marketplace/redhat-marketplace-d9xlp" Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.283336 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/f20cb95e-5480-4c9c-859f-0b03d679ab06-utilities\") pod \"redhat-marketplace-d9xlp\" (UID: \"f20cb95e-5480-4c9c-859f-0b03d679ab06\") " pod="openshift-marketplace/redhat-marketplace-d9xlp" Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.283451 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kg46p\" (UniqueName: \"kubernetes.io/projected/f20cb95e-5480-4c9c-859f-0b03d679ab06-kube-api-access-kg46p\") pod \"redhat-marketplace-d9xlp\" (UID: \"f20cb95e-5480-4c9c-859f-0b03d679ab06\") " pod="openshift-marketplace/redhat-marketplace-d9xlp" Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.285307 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f20cb95e-5480-4c9c-859f-0b03d679ab06-catalog-content\") pod \"redhat-marketplace-d9xlp\" (UID: \"f20cb95e-5480-4c9c-859f-0b03d679ab06\") " pod="openshift-marketplace/redhat-marketplace-d9xlp" Mar 20 10:58:32 crc kubenswrapper[4860]: E0320 10:58:32.285496 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:32.785457036 +0000 UTC m=+237.006817934 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.286110 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f20cb95e-5480-4c9c-859f-0b03d679ab06-utilities\") pod \"redhat-marketplace-d9xlp\" (UID: \"f20cb95e-5480-4c9c-859f-0b03d679ab06\") " pod="openshift-marketplace/redhat-marketplace-d9xlp" Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.286664 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-vpp2k" Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.286710 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-vpp2k" Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.356076 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kg46p\" (UniqueName: \"kubernetes.io/projected/f20cb95e-5480-4c9c-859f-0b03d679ab06-kube-api-access-kg46p\") pod \"redhat-marketplace-d9xlp\" (UID: \"f20cb95e-5480-4c9c-859f-0b03d679ab06\") " pod="openshift-marketplace/redhat-marketplace-d9xlp" Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.366915 4860 patch_prober.go:28] interesting pod/apiserver-76f77b778f-vpp2k container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 20 10:58:32 crc kubenswrapper[4860]: [+]log ok Mar 20 10:58:32 crc kubenswrapper[4860]: [+]etcd ok 
Mar 20 10:58:32 crc kubenswrapper[4860]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 20 10:58:32 crc kubenswrapper[4860]: [+]poststarthook/generic-apiserver-start-informers ok Mar 20 10:58:32 crc kubenswrapper[4860]: [+]poststarthook/max-in-flight-filter ok Mar 20 10:58:32 crc kubenswrapper[4860]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 20 10:58:32 crc kubenswrapper[4860]: [+]poststarthook/image.openshift.io-apiserver-caches ok Mar 20 10:58:32 crc kubenswrapper[4860]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Mar 20 10:58:32 crc kubenswrapper[4860]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Mar 20 10:58:32 crc kubenswrapper[4860]: [+]poststarthook/project.openshift.io-projectcache ok Mar 20 10:58:32 crc kubenswrapper[4860]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Mar 20 10:58:32 crc kubenswrapper[4860]: [+]poststarthook/openshift.io-startinformers ok Mar 20 10:58:32 crc kubenswrapper[4860]: [+]poststarthook/openshift.io-restmapperupdater ok Mar 20 10:58:32 crc kubenswrapper[4860]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 20 10:58:32 crc kubenswrapper[4860]: livez check failed Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.367006 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-vpp2k" podUID="a8f2eaf6-3749-4695-8df1-5972598c8ac6" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.367468 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-sqrz5" Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.368915 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-sqrz5" Mar 20 10:58:32 crc 
kubenswrapper[4860]: I0320 10:58:32.385258 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:32 crc kubenswrapper[4860]: E0320 10:58:32.386122 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:32.886099866 +0000 UTC m=+237.107460764 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.388015 4860 patch_prober.go:28] interesting pod/console-f9d7485db-sqrz5 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.388064 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-sqrz5" podUID="e8ca532e-b0d7-494c-886f-bff0c8009707" containerName="console" probeResult="failure" output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused" Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.388530 
4860 patch_prober.go:28] interesting pod/downloads-7954f5f757-45vfv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.388553 4860 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-45vfv" podUID="825c6b77-c03a-463c-b9a4-d26a1ac398f0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.388649 4860 patch_prober.go:28] interesting pod/downloads-7954f5f757-45vfv container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.388729 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-45vfv" podUID="825c6b77-c03a-463c-b9a4-d26a1ac398f0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.460244 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d9xlp" Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.477903 4860 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.490021 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:32 crc kubenswrapper[4860]: E0320 10:58:32.490575 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:32.990549161 +0000 UTC m=+237.211910069 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.550323 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5jpww"] Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.551668 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5jpww" Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.585240 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5jpww"] Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.596335 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.596392 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rln5j\" (UniqueName: \"kubernetes.io/projected/2268b7ae-c1db-4ef4-8236-60f7cfa277a1-kube-api-access-rln5j\") pod \"redhat-marketplace-5jpww\" (UID: \"2268b7ae-c1db-4ef4-8236-60f7cfa277a1\") " pod="openshift-marketplace/redhat-marketplace-5jpww" Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.596437 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2268b7ae-c1db-4ef4-8236-60f7cfa277a1-utilities\") pod \"redhat-marketplace-5jpww\" (UID: \"2268b7ae-c1db-4ef4-8236-60f7cfa277a1\") " pod="openshift-marketplace/redhat-marketplace-5jpww" Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.596513 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2268b7ae-c1db-4ef4-8236-60f7cfa277a1-catalog-content\") pod \"redhat-marketplace-5jpww\" (UID: \"2268b7ae-c1db-4ef4-8236-60f7cfa277a1\") " pod="openshift-marketplace/redhat-marketplace-5jpww" Mar 20 10:58:32 crc kubenswrapper[4860]: E0320 
10:58:32.596873 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:33.096860188 +0000 UTC m=+237.318221086 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dbgm" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.698459 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.698729 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2268b7ae-c1db-4ef4-8236-60f7cfa277a1-catalog-content\") pod \"redhat-marketplace-5jpww\" (UID: \"2268b7ae-c1db-4ef4-8236-60f7cfa277a1\") " pod="openshift-marketplace/redhat-marketplace-5jpww" Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.698781 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rln5j\" (UniqueName: \"kubernetes.io/projected/2268b7ae-c1db-4ef4-8236-60f7cfa277a1-kube-api-access-rln5j\") pod \"redhat-marketplace-5jpww\" (UID: \"2268b7ae-c1db-4ef4-8236-60f7cfa277a1\") " pod="openshift-marketplace/redhat-marketplace-5jpww" Mar 20 10:58:32 crc 
kubenswrapper[4860]: I0320 10:58:32.698818 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2268b7ae-c1db-4ef4-8236-60f7cfa277a1-utilities\") pod \"redhat-marketplace-5jpww\" (UID: \"2268b7ae-c1db-4ef4-8236-60f7cfa277a1\") " pod="openshift-marketplace/redhat-marketplace-5jpww" Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.699382 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2268b7ae-c1db-4ef4-8236-60f7cfa277a1-utilities\") pod \"redhat-marketplace-5jpww\" (UID: \"2268b7ae-c1db-4ef4-8236-60f7cfa277a1\") " pod="openshift-marketplace/redhat-marketplace-5jpww" Mar 20 10:58:32 crc kubenswrapper[4860]: E0320 10:58:32.699753 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:33.19973344 +0000 UTC m=+237.421094338 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.699764 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2268b7ae-c1db-4ef4-8236-60f7cfa277a1-catalog-content\") pod \"redhat-marketplace-5jpww\" (UID: \"2268b7ae-c1db-4ef4-8236-60f7cfa277a1\") " pod="openshift-marketplace/redhat-marketplace-5jpww" Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.720610 4860 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-20T10:58:32.47793414Z","Handler":null,"Name":""} Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.731988 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rln5j\" (UniqueName: \"kubernetes.io/projected/2268b7ae-c1db-4ef4-8236-60f7cfa277a1-kube-api-access-rln5j\") pod \"redhat-marketplace-5jpww\" (UID: \"2268b7ae-c1db-4ef4-8236-60f7cfa277a1\") " pod="openshift-marketplace/redhat-marketplace-5jpww" Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.733293 4860 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.733342 4860 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: 
/var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.760045 4860 generic.go:334] "Generic (PLEG): container finished" podID="d2690d8b-c7f7-4e71-af44-33444e4d6187" containerID="fc706b2c173f49d2a44bb4c1c738033acd0d9e47df9115abcd3736cf3695dc3d" exitCode=0 Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.760199 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n79b7" event={"ID":"d2690d8b-c7f7-4e71-af44-33444e4d6187","Type":"ContainerDied","Data":"fc706b2c173f49d2a44bb4c1c738033acd0d9e47df9115abcd3736cf3695dc3d"} Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.760293 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n79b7" event={"ID":"d2690d8b-c7f7-4e71-af44-33444e4d6187","Type":"ContainerStarted","Data":"d3e5a4f45fcca5a9f1ea6868d200bf518732a2225ce154b3a8c6e2fe9edbe0fc"} Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.781344 4860 generic.go:334] "Generic (PLEG): container finished" podID="7b622f82-e01c-42b8-8061-16b6e8f551fb" containerID="e24cde6154df13c72246e903afddf246ae7bde629e68f46946db4ded716f4fbf" exitCode=0 Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.781754 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p6sh9" event={"ID":"7b622f82-e01c-42b8-8061-16b6e8f551fb","Type":"ContainerDied","Data":"e24cde6154df13c72246e903afddf246ae7bde629e68f46946db4ded716f4fbf"} Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.781848 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p6sh9" event={"ID":"7b622f82-e01c-42b8-8061-16b6e8f551fb","Type":"ContainerStarted","Data":"a0b846fa7e38edf968a2ecba37cb073cb9550be5be8e0d139e448911ef2ef8dd"} Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.805176 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.826213 4860 generic.go:334] "Generic (PLEG): container finished" podID="437c32d4-4b5f-4657-86d6-5214e3bfc01f" containerID="509a0ab6073b8f241ed054d972f10c10904777731b271c4522d9caaf55b66c8c" exitCode=0 Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.826364 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-d6wf9" event={"ID":"437c32d4-4b5f-4657-86d6-5214e3bfc01f","Type":"ContainerDied","Data":"509a0ab6073b8f241ed054d972f10c10904777731b271c4522d9caaf55b66c8c"} Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.836369 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-zhgh4" Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.848984 4860 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.849045 4860 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm"
Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.866215 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5jpww"
Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.921715 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-2k58g" event={"ID":"dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383","Type":"ContainerStarted","Data":"7115075fbd3ae4e5afa0a20202db6ef747ec3fe98f0eb9b25e8cd4d053503781"}
Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.938738 4860 generic.go:334] "Generic (PLEG): container finished" podID="4f84f111-5991-4e78-9508-82283b8e36f7" containerID="6a6c388a79209365a4c51728af06e871b437cb5dd7b79151114287e046dc0dfe" exitCode=0
Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.938840 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w5w95" event={"ID":"4f84f111-5991-4e78-9508-82283b8e36f7","Type":"ContainerDied","Data":"6a6c388a79209365a4c51728af06e871b437cb5dd7b79151114287e046dc0dfe"}
Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.938875 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w5w95" event={"ID":"4f84f111-5991-4e78-9508-82283b8e36f7","Type":"ContainerStarted","Data":"1f538c1360593e9a410b70b066b34c33f5665e2dac735a2212ce3b3dbdf2dce0"}
Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.967217 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dbgm\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm"
Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.967828 4860 generic.go:334] "Generic (PLEG): container finished" podID="f0e14a08-824b-450f-bf98-2a476da0d44b" containerID="19dec2e9725dfca97383712d1fced11c792963a7799dee6ec4f9f37020ff60b2" exitCode=0
Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.973776 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r7ckk" event={"ID":"f0e14a08-824b-450f-bf98-2a476da0d44b","Type":"ContainerDied","Data":"19dec2e9725dfca97383712d1fced11c792963a7799dee6ec4f9f37020ff60b2"}
Mar 20 10:58:32 crc kubenswrapper[4860]: I0320 10:58:32.973974 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r7ckk" event={"ID":"f0e14a08-824b-450f-bf98-2a476da0d44b","Type":"ContainerStarted","Data":"711ef831caa70569060ba2dc068e9cede6a21ca93c6a666bf7abd4f4e2156736"}
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.011708 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.033838 4860 patch_prober.go:28] interesting pod/router-default-5444994796-5gdgj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 10:58:33 crc kubenswrapper[4860]: [-]has-synced failed: reason withheld
Mar 20 10:58:33 crc kubenswrapper[4860]: [+]process-running ok
Mar 20 10:58:33 crc kubenswrapper[4860]: healthz check failed
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.033921 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5gdgj" podUID="d582dc3e-7510-42be-aa3a-1d15b35c327c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.042115 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d9xlp"]
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.062023 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.205574 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm"
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.251424 4860 ???:1] "http: TLS handshake error from 192.168.126.11:42898: no serving certificate available for the kubelet"
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.314601 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qq8bh"]
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.317529 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qq8bh"
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.320914 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.381020 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qq8bh"]
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.421110 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/514f05c3-1404-46c6-9f4d-68437ea8ee0b-catalog-content\") pod \"redhat-operators-qq8bh\" (UID: \"514f05c3-1404-46c6-9f4d-68437ea8ee0b\") " pod="openshift-marketplace/redhat-operators-qq8bh"
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.421171 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/514f05c3-1404-46c6-9f4d-68437ea8ee0b-utilities\") pod \"redhat-operators-qq8bh\" (UID: \"514f05c3-1404-46c6-9f4d-68437ea8ee0b\") " pod="openshift-marketplace/redhat-operators-qq8bh"
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.421207 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdzkg\" (UniqueName: \"kubernetes.io/projected/514f05c3-1404-46c6-9f4d-68437ea8ee0b-kube-api-access-pdzkg\") pod \"redhat-operators-qq8bh\" (UID: \"514f05c3-1404-46c6-9f4d-68437ea8ee0b\") " pod="openshift-marketplace/redhat-operators-qq8bh"
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.444078 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes"
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.459484 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5jpww"]
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.523089 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/514f05c3-1404-46c6-9f4d-68437ea8ee0b-utilities\") pod \"redhat-operators-qq8bh\" (UID: \"514f05c3-1404-46c6-9f4d-68437ea8ee0b\") " pod="openshift-marketplace/redhat-operators-qq8bh"
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.523330 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdzkg\" (UniqueName: \"kubernetes.io/projected/514f05c3-1404-46c6-9f4d-68437ea8ee0b-kube-api-access-pdzkg\") pod \"redhat-operators-qq8bh\" (UID: \"514f05c3-1404-46c6-9f4d-68437ea8ee0b\") " pod="openshift-marketplace/redhat-operators-qq8bh"
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.523600 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/514f05c3-1404-46c6-9f4d-68437ea8ee0b-catalog-content\") pod \"redhat-operators-qq8bh\" (UID: \"514f05c3-1404-46c6-9f4d-68437ea8ee0b\") " pod="openshift-marketplace/redhat-operators-qq8bh"
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.525624 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/514f05c3-1404-46c6-9f4d-68437ea8ee0b-utilities\") pod \"redhat-operators-qq8bh\" (UID: \"514f05c3-1404-46c6-9f4d-68437ea8ee0b\") " pod="openshift-marketplace/redhat-operators-qq8bh"
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.525833 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/514f05c3-1404-46c6-9f4d-68437ea8ee0b-catalog-content\") pod \"redhat-operators-qq8bh\" (UID: \"514f05c3-1404-46c6-9f4d-68437ea8ee0b\") " pod="openshift-marketplace/redhat-operators-qq8bh"
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.552489 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdzkg\" (UniqueName: \"kubernetes.io/projected/514f05c3-1404-46c6-9f4d-68437ea8ee0b-kube-api-access-pdzkg\") pod \"redhat-operators-qq8bh\" (UID: \"514f05c3-1404-46c6-9f4d-68437ea8ee0b\") " pod="openshift-marketplace/redhat-operators-qq8bh"
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.600667 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.601655 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.605311 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt"
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.606136 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n"
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.611212 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.625562 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c7f16bf2-db43-4057-9961-ef03202f7828-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c7f16bf2-db43-4057-9961-ef03202f7828\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.625631 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c7f16bf2-db43-4057-9961-ef03202f7828-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c7f16bf2-db43-4057-9961-ef03202f7828\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.660065 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qq8bh"
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.712443 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jx27x"]
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.716773 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jx27x"
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.731121 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jx27x"]
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.733797 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c7f16bf2-db43-4057-9961-ef03202f7828-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c7f16bf2-db43-4057-9961-ef03202f7828\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.733893 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c7f16bf2-db43-4057-9961-ef03202f7828-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c7f16bf2-db43-4057-9961-ef03202f7828\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.733940 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mn9d7\" (UniqueName: \"kubernetes.io/projected/f81a43aa-2c39-4d49-8526-f097322dd7bf-kube-api-access-mn9d7\") pod \"redhat-operators-jx27x\" (UID: \"f81a43aa-2c39-4d49-8526-f097322dd7bf\") " pod="openshift-marketplace/redhat-operators-jx27x"
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.734064 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f81a43aa-2c39-4d49-8526-f097322dd7bf-utilities\") pod \"redhat-operators-jx27x\" (UID: \"f81a43aa-2c39-4d49-8526-f097322dd7bf\") " pod="openshift-marketplace/redhat-operators-jx27x"
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.734134 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f81a43aa-2c39-4d49-8526-f097322dd7bf-catalog-content\") pod \"redhat-operators-jx27x\" (UID: \"f81a43aa-2c39-4d49-8526-f097322dd7bf\") " pod="openshift-marketplace/redhat-operators-jx27x"
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.734274 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c7f16bf2-db43-4057-9961-ef03202f7828-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c7f16bf2-db43-4057-9961-ef03202f7828\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.755076 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c7f16bf2-db43-4057-9961-ef03202f7828-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c7f16bf2-db43-4057-9961-ef03202f7828\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.794200 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-8dbgm"]
Mar 20 10:58:33 crc kubenswrapper[4860]: W0320 10:58:33.801778 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39b41087_226b_4f73_9fc4_64616b430f2d.slice/crio-e1dc651025a62d4a8c6173da89e1be13654b3ccc8586f46471cbc5846256700b WatchSource:0}: Error finding container e1dc651025a62d4a8c6173da89e1be13654b3ccc8586f46471cbc5846256700b: Status 404 returned error can't find the container with id e1dc651025a62d4a8c6173da89e1be13654b3ccc8586f46471cbc5846256700b
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.835979 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f81a43aa-2c39-4d49-8526-f097322dd7bf-catalog-content\") pod \"redhat-operators-jx27x\" (UID: \"f81a43aa-2c39-4d49-8526-f097322dd7bf\") " pod="openshift-marketplace/redhat-operators-jx27x"
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.836132 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mn9d7\" (UniqueName: \"kubernetes.io/projected/f81a43aa-2c39-4d49-8526-f097322dd7bf-kube-api-access-mn9d7\") pod \"redhat-operators-jx27x\" (UID: \"f81a43aa-2c39-4d49-8526-f097322dd7bf\") " pod="openshift-marketplace/redhat-operators-jx27x"
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.836243 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f81a43aa-2c39-4d49-8526-f097322dd7bf-utilities\") pod \"redhat-operators-jx27x\" (UID: \"f81a43aa-2c39-4d49-8526-f097322dd7bf\") " pod="openshift-marketplace/redhat-operators-jx27x"
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.837102 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f81a43aa-2c39-4d49-8526-f097322dd7bf-catalog-content\") pod \"redhat-operators-jx27x\" (UID: \"f81a43aa-2c39-4d49-8526-f097322dd7bf\") " pod="openshift-marketplace/redhat-operators-jx27x"
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.837444 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f81a43aa-2c39-4d49-8526-f097322dd7bf-utilities\") pod \"redhat-operators-jx27x\" (UID: \"f81a43aa-2c39-4d49-8526-f097322dd7bf\") " pod="openshift-marketplace/redhat-operators-jx27x"
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.865173 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mn9d7\" (UniqueName: \"kubernetes.io/projected/f81a43aa-2c39-4d49-8526-f097322dd7bf-kube-api-access-mn9d7\") pod \"redhat-operators-jx27x\" (UID: \"f81a43aa-2c39-4d49-8526-f097322dd7bf\") " pod="openshift-marketplace/redhat-operators-jx27x"
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.946441 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.975357 4860 generic.go:334] "Generic (PLEG): container finished" podID="f20cb95e-5480-4c9c-859f-0b03d679ab06" containerID="3c2e183391f543c0b20ce684b31a9a8169bae71ceb6b6c5028bb8d294a9daeac" exitCode=0
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.975551 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d9xlp" event={"ID":"f20cb95e-5480-4c9c-859f-0b03d679ab06","Type":"ContainerDied","Data":"3c2e183391f543c0b20ce684b31a9a8169bae71ceb6b6c5028bb8d294a9daeac"}
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.977265 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d9xlp" event={"ID":"f20cb95e-5480-4c9c-859f-0b03d679ab06","Type":"ContainerStarted","Data":"39591daa264ba7bebe5fdc529015addd733110c1c54ed6b98d8a162a754e8d60"}
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.984088 4860 generic.go:334] "Generic (PLEG): container finished" podID="2268b7ae-c1db-4ef4-8236-60f7cfa277a1" containerID="5ed610d57137030afeaeb124289fb2f5072934d814423d8d1fd76ae4e4bbd772" exitCode=0
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.984272 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5jpww" event={"ID":"2268b7ae-c1db-4ef4-8236-60f7cfa277a1","Type":"ContainerDied","Data":"5ed610d57137030afeaeb124289fb2f5072934d814423d8d1fd76ae4e4bbd772"}
Mar 20 10:58:33 crc kubenswrapper[4860]: I0320 10:58:33.984329 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5jpww" event={"ID":"2268b7ae-c1db-4ef4-8236-60f7cfa277a1","Type":"ContainerStarted","Data":"5f7c8a0760c1233f7a673fb0037c3446fe619e19acbc1953810d2c42b3db815b"}
Mar 20 10:58:34 crc kubenswrapper[4860]: I0320 10:58:34.013189 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Mar 20 10:58:34 crc kubenswrapper[4860]: I0320 10:58:34.016803 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 20 10:58:34 crc kubenswrapper[4860]: I0320 10:58:34.019887 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Mar 20 10:58:34 crc kubenswrapper[4860]: I0320 10:58:34.019990 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Mar 20 10:58:34 crc kubenswrapper[4860]: I0320 10:58:34.022940 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qq8bh"]
Mar 20 10:58:34 crc kubenswrapper[4860]: I0320 10:58:34.022993 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-2k58g" event={"ID":"dd4cc0ff-db1a-4f4f-9fe6-6692a18b4383","Type":"ContainerStarted","Data":"6ae55a6ee512bab12f957da5d1b95423a1a9d0446f4cc36c58cf532984c034ba"}
Mar 20 10:58:34 crc kubenswrapper[4860]: I0320 10:58:34.024759 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" event={"ID":"39b41087-226b-4f73-9fc4-64616b430f2d","Type":"ContainerStarted","Data":"e1dc651025a62d4a8c6173da89e1be13654b3ccc8586f46471cbc5846256700b"}
Mar 20 10:58:34 crc kubenswrapper[4860]: I0320 10:58:34.027293 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Mar 20 10:58:34 crc kubenswrapper[4860]: I0320 10:58:34.032391 4860 patch_prober.go:28] interesting pod/router-default-5444994796-5gdgj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 10:58:34 crc kubenswrapper[4860]: [-]has-synced failed: reason withheld
Mar 20 10:58:34 crc kubenswrapper[4860]: [+]process-running ok
Mar 20 10:58:34 crc kubenswrapper[4860]: healthz check failed
Mar 20 10:58:34 crc kubenswrapper[4860]: I0320 10:58:34.032480 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5gdgj" podUID="d582dc3e-7510-42be-aa3a-1d15b35c327c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 10:58:34 crc kubenswrapper[4860]: I0320 10:58:34.042083 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ee04bfe1-80c1-43ea-9c2f-a8dde5f81388-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"ee04bfe1-80c1-43ea-9c2f-a8dde5f81388\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 20 10:58:34 crc kubenswrapper[4860]: I0320 10:58:34.042750 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ee04bfe1-80c1-43ea-9c2f-a8dde5f81388-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"ee04bfe1-80c1-43ea-9c2f-a8dde5f81388\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 20 10:58:34 crc kubenswrapper[4860]: I0320 10:58:34.078660 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jx27x"
Mar 20 10:58:34 crc kubenswrapper[4860]: I0320 10:58:34.116077 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-2k58g" podStartSLOduration=15.116046746 podStartE2EDuration="15.116046746s" podCreationTimestamp="2026-03-20 10:58:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:34.108181307 +0000 UTC m=+238.329542215" watchObservedRunningTime="2026-03-20 10:58:34.116046746 +0000 UTC m=+238.337407644"
Mar 20 10:58:34 crc kubenswrapper[4860]: I0320 10:58:34.150063 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ee04bfe1-80c1-43ea-9c2f-a8dde5f81388-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"ee04bfe1-80c1-43ea-9c2f-a8dde5f81388\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 20 10:58:34 crc kubenswrapper[4860]: I0320 10:58:34.150365 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ee04bfe1-80c1-43ea-9c2f-a8dde5f81388-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"ee04bfe1-80c1-43ea-9c2f-a8dde5f81388\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 20 10:58:34 crc kubenswrapper[4860]: I0320 10:58:34.151820 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ee04bfe1-80c1-43ea-9c2f-a8dde5f81388-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"ee04bfe1-80c1-43ea-9c2f-a8dde5f81388\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 20 10:58:34 crc kubenswrapper[4860]: I0320 10:58:34.185492 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ee04bfe1-80c1-43ea-9c2f-a8dde5f81388-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"ee04bfe1-80c1-43ea-9c2f-a8dde5f81388\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 20 10:58:34 crc kubenswrapper[4860]: I0320 10:58:34.264891 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Mar 20 10:58:34 crc kubenswrapper[4860]: I0320 10:58:34.349330 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 20 10:58:34 crc kubenswrapper[4860]: I0320 10:58:34.465978 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-d6wf9"
Mar 20 10:58:34 crc kubenswrapper[4860]: I0320 10:58:34.557026 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wpk7d\" (UniqueName: \"kubernetes.io/projected/437c32d4-4b5f-4657-86d6-5214e3bfc01f-kube-api-access-wpk7d\") pod \"437c32d4-4b5f-4657-86d6-5214e3bfc01f\" (UID: \"437c32d4-4b5f-4657-86d6-5214e3bfc01f\") "
Mar 20 10:58:34 crc kubenswrapper[4860]: I0320 10:58:34.557140 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/437c32d4-4b5f-4657-86d6-5214e3bfc01f-config-volume\") pod \"437c32d4-4b5f-4657-86d6-5214e3bfc01f\" (UID: \"437c32d4-4b5f-4657-86d6-5214e3bfc01f\") "
Mar 20 10:58:34 crc kubenswrapper[4860]: I0320 10:58:34.557308 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/437c32d4-4b5f-4657-86d6-5214e3bfc01f-secret-volume\") pod \"437c32d4-4b5f-4657-86d6-5214e3bfc01f\" (UID: \"437c32d4-4b5f-4657-86d6-5214e3bfc01f\") "
Mar 20 10:58:34 crc kubenswrapper[4860]: I0320 10:58:34.558472 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/437c32d4-4b5f-4657-86d6-5214e3bfc01f-config-volume" (OuterVolumeSpecName: "config-volume") pod "437c32d4-4b5f-4657-86d6-5214e3bfc01f" (UID: "437c32d4-4b5f-4657-86d6-5214e3bfc01f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 10:58:34 crc kubenswrapper[4860]: I0320 10:58:34.561294 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jx27x"]
Mar 20 10:58:34 crc kubenswrapper[4860]: I0320 10:58:34.565438 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/437c32d4-4b5f-4657-86d6-5214e3bfc01f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "437c32d4-4b5f-4657-86d6-5214e3bfc01f" (UID: "437c32d4-4b5f-4657-86d6-5214e3bfc01f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 10:58:34 crc kubenswrapper[4860]: I0320 10:58:34.565626 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/437c32d4-4b5f-4657-86d6-5214e3bfc01f-kube-api-access-wpk7d" (OuterVolumeSpecName: "kube-api-access-wpk7d") pod "437c32d4-4b5f-4657-86d6-5214e3bfc01f" (UID: "437c32d4-4b5f-4657-86d6-5214e3bfc01f"). InnerVolumeSpecName "kube-api-access-wpk7d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 10:58:34 crc kubenswrapper[4860]: W0320 10:58:34.630870 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf81a43aa_2c39_4d49_8526_f097322dd7bf.slice/crio-b681f56bdfe5d7107d46e455eb32b402c90f63f71364660357f8ecd488fda604 WatchSource:0}: Error finding container b681f56bdfe5d7107d46e455eb32b402c90f63f71364660357f8ecd488fda604: Status 404 returned error can't find the container with id b681f56bdfe5d7107d46e455eb32b402c90f63f71364660357f8ecd488fda604
Mar 20 10:58:34 crc kubenswrapper[4860]: I0320 10:58:34.660046 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wpk7d\" (UniqueName: \"kubernetes.io/projected/437c32d4-4b5f-4657-86d6-5214e3bfc01f-kube-api-access-wpk7d\") on node \"crc\" DevicePath \"\""
Mar 20 10:58:34 crc kubenswrapper[4860]: I0320 10:58:34.660160 4860 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/437c32d4-4b5f-4657-86d6-5214e3bfc01f-config-volume\") on node \"crc\" DevicePath \"\""
Mar 20 10:58:34 crc kubenswrapper[4860]: I0320 10:58:34.660175 4860 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/437c32d4-4b5f-4657-86d6-5214e3bfc01f-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 20 10:58:34 crc kubenswrapper[4860]: I0320 10:58:34.707899 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Mar 20 10:58:34 crc kubenswrapper[4860]: W0320 10:58:34.835535 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podee04bfe1_80c1_43ea_9c2f_a8dde5f81388.slice/crio-da7b08ed5546a0ff8c7a39bfbda760782f55c7c1cfa7a2816badce41f2f14d8f WatchSource:0}: Error finding container da7b08ed5546a0ff8c7a39bfbda760782f55c7c1cfa7a2816badce41f2f14d8f: Status 404 returned error can't find the container with id da7b08ed5546a0ff8c7a39bfbda760782f55c7c1cfa7a2816badce41f2f14d8f
Mar 20 10:58:35 crc kubenswrapper[4860]: I0320 10:58:35.026642 4860 patch_prober.go:28] interesting pod/router-default-5444994796-5gdgj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 10:58:35 crc kubenswrapper[4860]: [-]has-synced failed: reason withheld
Mar 20 10:58:35 crc kubenswrapper[4860]: [+]process-running ok
Mar 20 10:58:35 crc kubenswrapper[4860]: healthz check failed
Mar 20 10:58:35 crc kubenswrapper[4860]: I0320 10:58:35.026700 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5gdgj" podUID="d582dc3e-7510-42be-aa3a-1d15b35c327c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 10:58:35 crc kubenswrapper[4860]: I0320 10:58:35.045125 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" event={"ID":"39b41087-226b-4f73-9fc4-64616b430f2d","Type":"ContainerStarted","Data":"f3f20d10721886ec4a9101220e9e657b2d89cba3d5a38f7e81c1ff64702925d8"}
Mar 20 10:58:35 crc kubenswrapper[4860]: I0320 10:58:35.047578 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm"
Mar 20 10:58:35 crc kubenswrapper[4860]: I0320 10:58:35.049030 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"c7f16bf2-db43-4057-9961-ef03202f7828","Type":"ContainerStarted","Data":"72fea729710975c82c871f9d3fed185003fd3dc264665ec3bf550bc85ad152a0"}
Mar 20 10:58:35 crc kubenswrapper[4860]: I0320 10:58:35.049077 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"c7f16bf2-db43-4057-9961-ef03202f7828","Type":"ContainerStarted","Data":"b4581e8a8e22d21944cf78b64c955a6fce94adca574c93dd2c425e73eb876cb3"}
Mar 20 10:58:35 crc kubenswrapper[4860]: I0320 10:58:35.057667 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"ee04bfe1-80c1-43ea-9c2f-a8dde5f81388","Type":"ContainerStarted","Data":"da7b08ed5546a0ff8c7a39bfbda760782f55c7c1cfa7a2816badce41f2f14d8f"}
Mar 20 10:58:35 crc kubenswrapper[4860]: I0320 10:58:35.061149 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-d6wf9"
Mar 20 10:58:35 crc kubenswrapper[4860]: I0320 10:58:35.061149 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-d6wf9" event={"ID":"437c32d4-4b5f-4657-86d6-5214e3bfc01f","Type":"ContainerDied","Data":"07d327bc1bb178b575c3169b4eaad76591b2a789fd3236207f1f2278827c3306"}
Mar 20 10:58:35 crc kubenswrapper[4860]: I0320 10:58:35.061250 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="07d327bc1bb178b575c3169b4eaad76591b2a789fd3236207f1f2278827c3306"
Mar 20 10:58:35 crc kubenswrapper[4860]: I0320 10:58:35.064262 4860 generic.go:334] "Generic (PLEG): container finished" podID="514f05c3-1404-46c6-9f4d-68437ea8ee0b" containerID="62152cf84060d8945786af63e9ccf7c263d87bdc6a8315bb08a5eacc7087c847" exitCode=0
Mar 20 10:58:35 crc kubenswrapper[4860]: I0320 10:58:35.064358 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qq8bh" event={"ID":"514f05c3-1404-46c6-9f4d-68437ea8ee0b","Type":"ContainerDied","Data":"62152cf84060d8945786af63e9ccf7c263d87bdc6a8315bb08a5eacc7087c847"}
Mar 20 10:58:35 crc kubenswrapper[4860]: I0320 10:58:35.064404 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qq8bh" event={"ID":"514f05c3-1404-46c6-9f4d-68437ea8ee0b","Type":"ContainerStarted","Data":"db09468d977aabd81ce312da99eaa8c50b25e5282affd310a612fbfda038e94c"}
Mar 20 10:58:35 crc kubenswrapper[4860]: I0320 10:58:35.077397 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jx27x" event={"ID":"f81a43aa-2c39-4d49-8526-f097322dd7bf","Type":"ContainerStarted","Data":"cc28a9c4b1f826fc06b2b83281cd0a01bf1dc28b3e9617ab722a34ea90577dc6"}
Mar 20 10:58:35 crc kubenswrapper[4860]: I0320 10:58:35.077446 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jx27x" event={"ID":"f81a43aa-2c39-4d49-8526-f097322dd7bf","Type":"ContainerStarted","Data":"b681f56bdfe5d7107d46e455eb32b402c90f63f71364660357f8ecd488fda604"}
Mar 20 10:58:35 crc kubenswrapper[4860]: I0320 10:58:35.079698 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" podStartSLOduration=167.079670921 podStartE2EDuration="2m47.079670921s" podCreationTimestamp="2026-03-20 10:55:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:35.072619655 +0000 UTC m=+239.293980553" watchObservedRunningTime="2026-03-20 10:58:35.079670921 +0000 UTC m=+239.301031819"
Mar 20 10:58:35 crc kubenswrapper[4860]: I0320 10:58:35.122446 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.122415 podStartE2EDuration="2.122415s" podCreationTimestamp="2026-03-20 10:58:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:35.121031871 +0000 UTC m=+239.342392789" watchObservedRunningTime="2026-03-20 10:58:35.122415 +0000 UTC m=+239.343775898"
Mar 20 10:58:36 crc kubenswrapper[4860]: I0320 10:58:36.025608 4860 patch_prober.go:28] interesting pod/router-default-5444994796-5gdgj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 10:58:36 crc kubenswrapper[4860]: [-]has-synced failed: reason withheld
Mar 20 10:58:36 crc kubenswrapper[4860]: [+]process-running ok
Mar 20 10:58:36 crc kubenswrapper[4860]: healthz check failed
Mar 20 10:58:36 crc kubenswrapper[4860]: I0320 10:58:36.025699 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5gdgj" podUID="d582dc3e-7510-42be-aa3a-1d15b35c327c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 10:58:36 crc kubenswrapper[4860]: I0320 10:58:36.103917 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"ee04bfe1-80c1-43ea-9c2f-a8dde5f81388","Type":"ContainerStarted","Data":"58095badac555e1dbfbe198d782954bf2ce2eba6cbdea6f1e244cd2ee4bb9ae8"}
Mar 20 10:58:36 crc kubenswrapper[4860]: I0320 10:58:36.108526 4860 generic.go:334] "Generic (PLEG): container finished" podID="f81a43aa-2c39-4d49-8526-f097322dd7bf" containerID="cc28a9c4b1f826fc06b2b83281cd0a01bf1dc28b3e9617ab722a34ea90577dc6" exitCode=0
Mar 20 10:58:36 crc kubenswrapper[4860]: I0320 10:58:36.108595 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jx27x" event={"ID":"f81a43aa-2c39-4d49-8526-f097322dd7bf","Type":"ContainerDied","Data":"cc28a9c4b1f826fc06b2b83281cd0a01bf1dc28b3e9617ab722a34ea90577dc6"}
Mar 20 10:58:36 crc kubenswrapper[4860]: I0320 10:58:36.124121 4860 generic.go:334] "Generic (PLEG): container finished" podID="c7f16bf2-db43-4057-9961-ef03202f7828" containerID="72fea729710975c82c871f9d3fed185003fd3dc264665ec3bf550bc85ad152a0" exitCode=0
Mar 20 10:58:36 crc kubenswrapper[4860]: I0320 10:58:36.124736 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"c7f16bf2-db43-4057-9961-ef03202f7828","Type":"ContainerDied","Data":"72fea729710975c82c871f9d3fed185003fd3dc264665ec3bf550bc85ad152a0"}
Mar 20 10:58:36 crc kubenswrapper[4860]: I0320 10:58:36.173592 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=3.1735413980000002 podStartE2EDuration="3.173541398s" podCreationTimestamp="2026-03-20 10:58:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:36.135320615 +0000 UTC m=+240.356681523" watchObservedRunningTime="2026-03-20 10:58:36.173541398 +0000 UTC m=+240.394902296"
Mar 20 10:58:37 crc kubenswrapper[4860]: I0320 10:58:37.025378 4860 patch_prober.go:28] interesting pod/router-default-5444994796-5gdgj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 10:58:37 crc kubenswrapper[4860]: [-]has-synced failed: reason withheld
Mar 20 10:58:37 crc kubenswrapper[4860]: [+]process-running ok
Mar 20 10:58:37 crc kubenswrapper[4860]: healthz check failed
Mar 20 10:58:37 crc kubenswrapper[4860]: I0320 10:58:37.025459 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5gdgj" podUID="d582dc3e-7510-42be-aa3a-1d15b35c327c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 10:58:37 crc kubenswrapper[4860]: I0320 10:58:37.153739 4860 generic.go:334] "Generic (PLEG): container finished" podID="ee04bfe1-80c1-43ea-9c2f-a8dde5f81388" containerID="58095badac555e1dbfbe198d782954bf2ce2eba6cbdea6f1e244cd2ee4bb9ae8" exitCode=0
Mar 20 10:58:37 crc
kubenswrapper[4860]: I0320 10:58:37.156119 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"ee04bfe1-80c1-43ea-9c2f-a8dde5f81388","Type":"ContainerDied","Data":"58095badac555e1dbfbe198d782954bf2ce2eba6cbdea6f1e244cd2ee4bb9ae8"} Mar 20 10:58:37 crc kubenswrapper[4860]: I0320 10:58:37.288883 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-vpp2k" Mar 20 10:58:37 crc kubenswrapper[4860]: I0320 10:58:37.299945 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-vpp2k" Mar 20 10:58:37 crc kubenswrapper[4860]: I0320 10:58:37.703411 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 10:58:37 crc kubenswrapper[4860]: I0320 10:58:37.866094 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c7f16bf2-db43-4057-9961-ef03202f7828-kube-api-access\") pod \"c7f16bf2-db43-4057-9961-ef03202f7828\" (UID: \"c7f16bf2-db43-4057-9961-ef03202f7828\") " Mar 20 10:58:37 crc kubenswrapper[4860]: I0320 10:58:37.866175 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c7f16bf2-db43-4057-9961-ef03202f7828-kubelet-dir\") pod \"c7f16bf2-db43-4057-9961-ef03202f7828\" (UID: \"c7f16bf2-db43-4057-9961-ef03202f7828\") " Mar 20 10:58:37 crc kubenswrapper[4860]: I0320 10:58:37.866724 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c7f16bf2-db43-4057-9961-ef03202f7828-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "c7f16bf2-db43-4057-9961-ef03202f7828" (UID: "c7f16bf2-db43-4057-9961-ef03202f7828"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 10:58:37 crc kubenswrapper[4860]: I0320 10:58:37.884572 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7f16bf2-db43-4057-9961-ef03202f7828-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "c7f16bf2-db43-4057-9961-ef03202f7828" (UID: "c7f16bf2-db43-4057-9961-ef03202f7828"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:58:37 crc kubenswrapper[4860]: I0320 10:58:37.968243 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c7f16bf2-db43-4057-9961-ef03202f7828-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:37 crc kubenswrapper[4860]: I0320 10:58:37.968784 4860 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c7f16bf2-db43-4057-9961-ef03202f7828-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:38 crc kubenswrapper[4860]: I0320 10:58:38.030744 4860 patch_prober.go:28] interesting pod/router-default-5444994796-5gdgj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 10:58:38 crc kubenswrapper[4860]: [-]has-synced failed: reason withheld Mar 20 10:58:38 crc kubenswrapper[4860]: [+]process-running ok Mar 20 10:58:38 crc kubenswrapper[4860]: healthz check failed Mar 20 10:58:38 crc kubenswrapper[4860]: I0320 10:58:38.030824 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5gdgj" podUID="d582dc3e-7510-42be-aa3a-1d15b35c327c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 10:58:38 crc kubenswrapper[4860]: I0320 10:58:38.184231 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 10:58:38 crc kubenswrapper[4860]: I0320 10:58:38.184256 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"c7f16bf2-db43-4057-9961-ef03202f7828","Type":"ContainerDied","Data":"b4581e8a8e22d21944cf78b64c955a6fce94adca574c93dd2c425e73eb876cb3"} Mar 20 10:58:38 crc kubenswrapper[4860]: I0320 10:58:38.184335 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b4581e8a8e22d21944cf78b64c955a6fce94adca574c93dd2c425e73eb876cb3" Mar 20 10:58:38 crc kubenswrapper[4860]: I0320 10:58:38.190295 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-l2xf5" Mar 20 10:58:38 crc kubenswrapper[4860]: I0320 10:58:38.415319 4860 ???:1] "http: TLS handshake error from 192.168.126.11:42910: no serving certificate available for the kubelet" Mar 20 10:58:38 crc kubenswrapper[4860]: I0320 10:58:38.517095 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 10:58:38 crc kubenswrapper[4860]: I0320 10:58:38.692302 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ee04bfe1-80c1-43ea-9c2f-a8dde5f81388-kubelet-dir\") pod \"ee04bfe1-80c1-43ea-9c2f-a8dde5f81388\" (UID: \"ee04bfe1-80c1-43ea-9c2f-a8dde5f81388\") " Mar 20 10:58:38 crc kubenswrapper[4860]: I0320 10:58:38.692436 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ee04bfe1-80c1-43ea-9c2f-a8dde5f81388-kube-api-access\") pod \"ee04bfe1-80c1-43ea-9c2f-a8dde5f81388\" (UID: \"ee04bfe1-80c1-43ea-9c2f-a8dde5f81388\") " Mar 20 10:58:38 crc kubenswrapper[4860]: I0320 10:58:38.692448 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ee04bfe1-80c1-43ea-9c2f-a8dde5f81388-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "ee04bfe1-80c1-43ea-9c2f-a8dde5f81388" (UID: "ee04bfe1-80c1-43ea-9c2f-a8dde5f81388"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 10:58:38 crc kubenswrapper[4860]: I0320 10:58:38.692900 4860 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ee04bfe1-80c1-43ea-9c2f-a8dde5f81388-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:38 crc kubenswrapper[4860]: I0320 10:58:38.699519 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee04bfe1-80c1-43ea-9c2f-a8dde5f81388-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "ee04bfe1-80c1-43ea-9c2f-a8dde5f81388" (UID: "ee04bfe1-80c1-43ea-9c2f-a8dde5f81388"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:58:38 crc kubenswrapper[4860]: I0320 10:58:38.793977 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ee04bfe1-80c1-43ea-9c2f-a8dde5f81388-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:39 crc kubenswrapper[4860]: I0320 10:58:39.025411 4860 patch_prober.go:28] interesting pod/router-default-5444994796-5gdgj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 10:58:39 crc kubenswrapper[4860]: [-]has-synced failed: reason withheld Mar 20 10:58:39 crc kubenswrapper[4860]: [+]process-running ok Mar 20 10:58:39 crc kubenswrapper[4860]: healthz check failed Mar 20 10:58:39 crc kubenswrapper[4860]: I0320 10:58:39.025510 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5gdgj" podUID="d582dc3e-7510-42be-aa3a-1d15b35c327c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 10:58:39 crc kubenswrapper[4860]: I0320 10:58:39.227611 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"ee04bfe1-80c1-43ea-9c2f-a8dde5f81388","Type":"ContainerDied","Data":"da7b08ed5546a0ff8c7a39bfbda760782f55c7c1cfa7a2816badce41f2f14d8f"} Mar 20 10:58:39 crc kubenswrapper[4860]: I0320 10:58:39.227669 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da7b08ed5546a0ff8c7a39bfbda760782f55c7c1cfa7a2816badce41f2f14d8f" Mar 20 10:58:39 crc kubenswrapper[4860]: I0320 10:58:39.227728 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 10:58:40 crc kubenswrapper[4860]: I0320 10:58:40.023537 4860 patch_prober.go:28] interesting pod/router-default-5444994796-5gdgj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 10:58:40 crc kubenswrapper[4860]: [-]has-synced failed: reason withheld Mar 20 10:58:40 crc kubenswrapper[4860]: [+]process-running ok Mar 20 10:58:40 crc kubenswrapper[4860]: healthz check failed Mar 20 10:58:40 crc kubenswrapper[4860]: I0320 10:58:40.023611 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5gdgj" podUID="d582dc3e-7510-42be-aa3a-1d15b35c327c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 10:58:40 crc kubenswrapper[4860]: I0320 10:58:40.421018 4860 ???:1] "http: TLS handshake error from 192.168.126.11:38984: no serving certificate available for the kubelet" Mar 20 10:58:41 crc kubenswrapper[4860]: I0320 10:58:41.026531 4860 patch_prober.go:28] interesting pod/router-default-5444994796-5gdgj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 10:58:41 crc kubenswrapper[4860]: [-]has-synced failed: reason withheld Mar 20 10:58:41 crc kubenswrapper[4860]: [+]process-running ok Mar 20 10:58:41 crc kubenswrapper[4860]: healthz check failed Mar 20 10:58:41 crc kubenswrapper[4860]: I0320 10:58:41.026608 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5gdgj" podUID="d582dc3e-7510-42be-aa3a-1d15b35c327c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 10:58:42 crc kubenswrapper[4860]: I0320 10:58:42.023770 4860 
patch_prober.go:28] interesting pod/router-default-5444994796-5gdgj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 10:58:42 crc kubenswrapper[4860]: [-]has-synced failed: reason withheld Mar 20 10:58:42 crc kubenswrapper[4860]: [+]process-running ok Mar 20 10:58:42 crc kubenswrapper[4860]: healthz check failed Mar 20 10:58:42 crc kubenswrapper[4860]: I0320 10:58:42.024104 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5gdgj" podUID="d582dc3e-7510-42be-aa3a-1d15b35c327c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 10:58:42 crc kubenswrapper[4860]: I0320 10:58:42.363033 4860 patch_prober.go:28] interesting pod/console-f9d7485db-sqrz5 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Mar 20 10:58:42 crc kubenswrapper[4860]: I0320 10:58:42.363102 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-sqrz5" podUID="e8ca532e-b0d7-494c-886f-bff0c8009707" containerName="console" probeResult="failure" output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused" Mar 20 10:58:42 crc kubenswrapper[4860]: I0320 10:58:42.390908 4860 patch_prober.go:28] interesting pod/downloads-7954f5f757-45vfv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Mar 20 10:58:42 crc kubenswrapper[4860]: I0320 10:58:42.390953 4860 patch_prober.go:28] interesting pod/downloads-7954f5f757-45vfv container/download-server namespace/openshift-console: Liveness probe status=failure 
output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Mar 20 10:58:42 crc kubenswrapper[4860]: I0320 10:58:42.390978 4860 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-45vfv" podUID="825c6b77-c03a-463c-b9a4-d26a1ac398f0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Mar 20 10:58:42 crc kubenswrapper[4860]: I0320 10:58:42.391009 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-45vfv" podUID="825c6b77-c03a-463c-b9a4-d26a1ac398f0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Mar 20 10:58:43 crc kubenswrapper[4860]: I0320 10:58:43.024275 4860 patch_prober.go:28] interesting pod/router-default-5444994796-5gdgj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 10:58:43 crc kubenswrapper[4860]: [-]has-synced failed: reason withheld Mar 20 10:58:43 crc kubenswrapper[4860]: [+]process-running ok Mar 20 10:58:43 crc kubenswrapper[4860]: healthz check failed Mar 20 10:58:43 crc kubenswrapper[4860]: I0320 10:58:43.024404 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5gdgj" podUID="d582dc3e-7510-42be-aa3a-1d15b35c327c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 10:58:43 crc kubenswrapper[4860]: I0320 10:58:43.183645 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/035f0b3d-92ee-4564-8dad-28b231e1c800-metrics-certs\") pod \"network-metrics-daemon-q85gq\" (UID: 
\"035f0b3d-92ee-4564-8dad-28b231e1c800\") " pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:58:43 crc kubenswrapper[4860]: I0320 10:58:43.185363 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 20 10:58:43 crc kubenswrapper[4860]: I0320 10:58:43.204929 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/035f0b3d-92ee-4564-8dad-28b231e1c800-metrics-certs\") pod \"network-metrics-daemon-q85gq\" (UID: \"035f0b3d-92ee-4564-8dad-28b231e1c800\") " pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:58:43 crc kubenswrapper[4860]: I0320 10:58:43.439208 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 20 10:58:43 crc kubenswrapper[4860]: I0320 10:58:43.446270 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q85gq" Mar 20 10:58:44 crc kubenswrapper[4860]: I0320 10:58:44.024567 4860 patch_prober.go:28] interesting pod/router-default-5444994796-5gdgj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 10:58:44 crc kubenswrapper[4860]: [-]has-synced failed: reason withheld Mar 20 10:58:44 crc kubenswrapper[4860]: [+]process-running ok Mar 20 10:58:44 crc kubenswrapper[4860]: healthz check failed Mar 20 10:58:44 crc kubenswrapper[4860]: I0320 10:58:44.024644 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5gdgj" podUID="d582dc3e-7510-42be-aa3a-1d15b35c327c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 10:58:44 crc kubenswrapper[4860]: I0320 10:58:44.167108 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-controller-manager/controller-manager-9ffd4b47b-9qh65"] Mar 20 10:58:44 crc kubenswrapper[4860]: I0320 10:58:44.167394 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-9ffd4b47b-9qh65" podUID="2af6ac3e-b4b2-400b-bfd2-0142f07dabd0" containerName="controller-manager" containerID="cri-o://9e7a764e0c672086dc2232d2c01074598903d934f8d665fc7b2f1e55bf12ba31" gracePeriod=30 Mar 20 10:58:44 crc kubenswrapper[4860]: I0320 10:58:44.196751 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5dc8897f6c-8dhfx"] Mar 20 10:58:44 crc kubenswrapper[4860]: I0320 10:58:44.197010 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5dc8897f6c-8dhfx" podUID="215a61d8-f0e1-419d-b4cb-8ddc801d5a79" containerName="route-controller-manager" containerID="cri-o://07cb2faf1a3563eaa92fb2a9f50a19ad2493c7dab6c573aa3b095afac43feb29" gracePeriod=30 Mar 20 10:58:45 crc kubenswrapper[4860]: I0320 10:58:45.024731 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-5gdgj" Mar 20 10:58:45 crc kubenswrapper[4860]: I0320 10:58:45.028676 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-5gdgj" Mar 20 10:58:45 crc kubenswrapper[4860]: I0320 10:58:45.305588 4860 generic.go:334] "Generic (PLEG): container finished" podID="2af6ac3e-b4b2-400b-bfd2-0142f07dabd0" containerID="9e7a764e0c672086dc2232d2c01074598903d934f8d665fc7b2f1e55bf12ba31" exitCode=0 Mar 20 10:58:45 crc kubenswrapper[4860]: I0320 10:58:45.305701 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-9ffd4b47b-9qh65" 
event={"ID":"2af6ac3e-b4b2-400b-bfd2-0142f07dabd0","Type":"ContainerDied","Data":"9e7a764e0c672086dc2232d2c01074598903d934f8d665fc7b2f1e55bf12ba31"} Mar 20 10:58:45 crc kubenswrapper[4860]: I0320 10:58:45.309373 4860 generic.go:334] "Generic (PLEG): container finished" podID="215a61d8-f0e1-419d-b4cb-8ddc801d5a79" containerID="07cb2faf1a3563eaa92fb2a9f50a19ad2493c7dab6c573aa3b095afac43feb29" exitCode=0 Mar 20 10:58:45 crc kubenswrapper[4860]: I0320 10:58:45.309433 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5dc8897f6c-8dhfx" event={"ID":"215a61d8-f0e1-419d-b4cb-8ddc801d5a79","Type":"ContainerDied","Data":"07cb2faf1a3563eaa92fb2a9f50a19ad2493c7dab6c573aa3b095afac43feb29"} Mar 20 10:58:46 crc kubenswrapper[4860]: I0320 10:58:46.977536 4860 patch_prober.go:28] interesting pod/controller-manager-9ffd4b47b-9qh65 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.45:8443/healthz\": dial tcp 10.217.0.45:8443: connect: connection refused" start-of-body= Mar 20 10:58:46 crc kubenswrapper[4860]: I0320 10:58:46.977638 4860 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-9ffd4b47b-9qh65" podUID="2af6ac3e-b4b2-400b-bfd2-0142f07dabd0" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.45:8443/healthz\": dial tcp 10.217.0.45:8443: connect: connection refused" Mar 20 10:58:48 crc kubenswrapper[4860]: I0320 10:58:48.681698 4860 ???:1] "http: TLS handshake error from 192.168.126.11:38990: no serving certificate available for the kubelet" Mar 20 10:58:49 crc kubenswrapper[4860]: I0320 10:58:49.397699 4860 patch_prober.go:28] interesting pod/route-controller-manager-5dc8897f6c-8dhfx container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get 
\"https://10.217.0.46:8443/healthz\": dial tcp 10.217.0.46:8443: connect: connection refused" start-of-body= Mar 20 10:58:49 crc kubenswrapper[4860]: I0320 10:58:49.397787 4860 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-5dc8897f6c-8dhfx" podUID="215a61d8-f0e1-419d-b4cb-8ddc801d5a79" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.46:8443/healthz\": dial tcp 10.217.0.46:8443: connect: connection refused" Mar 20 10:58:52 crc kubenswrapper[4860]: I0320 10:58:52.344928 4860 patch_prober.go:28] interesting pod/machine-config-daemon-kvdqp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 10:58:52 crc kubenswrapper[4860]: I0320 10:58:52.345026 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 10:58:52 crc kubenswrapper[4860]: I0320 10:58:52.388885 4860 patch_prober.go:28] interesting pod/downloads-7954f5f757-45vfv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Mar 20 10:58:52 crc kubenswrapper[4860]: I0320 10:58:52.389346 4860 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-45vfv" podUID="825c6b77-c03a-463c-b9a4-d26a1ac398f0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Mar 20 10:58:52 crc 
kubenswrapper[4860]: I0320 10:58:52.388885 4860 patch_prober.go:28] interesting pod/downloads-7954f5f757-45vfv container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Mar 20 10:58:52 crc kubenswrapper[4860]: I0320 10:58:52.389419 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-45vfv" podUID="825c6b77-c03a-463c-b9a4-d26a1ac398f0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Mar 20 10:58:52 crc kubenswrapper[4860]: I0320 10:58:52.389457 4860 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console/downloads-7954f5f757-45vfv" Mar 20 10:58:52 crc kubenswrapper[4860]: I0320 10:58:52.389937 4860 patch_prober.go:28] interesting pod/downloads-7954f5f757-45vfv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Mar 20 10:58:52 crc kubenswrapper[4860]: I0320 10:58:52.390008 4860 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-45vfv" podUID="825c6b77-c03a-463c-b9a4-d26a1ac398f0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Mar 20 10:58:52 crc kubenswrapper[4860]: I0320 10:58:52.390339 4860 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="download-server" containerStatusID={"Type":"cri-o","ID":"881f32cfad3d1ae79569c4b2cd108eca09988b3ebd33507a5a8247454327f28d"} pod="openshift-console/downloads-7954f5f757-45vfv" containerMessage="Container download-server failed liveness probe, will be restarted" Mar 20 10:58:52 crc 
kubenswrapper[4860]: I0320 10:58:52.390404 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/downloads-7954f5f757-45vfv" podUID="825c6b77-c03a-463c-b9a4-d26a1ac398f0" containerName="download-server" containerID="cri-o://881f32cfad3d1ae79569c4b2cd108eca09988b3ebd33507a5a8247454327f28d" gracePeriod=2 Mar 20 10:58:52 crc kubenswrapper[4860]: I0320 10:58:52.627495 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-sqrz5" Mar 20 10:58:52 crc kubenswrapper[4860]: I0320 10:58:52.633937 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-sqrz5" Mar 20 10:58:53 crc kubenswrapper[4860]: I0320 10:58:53.210653 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 10:58:53 crc kubenswrapper[4860]: I0320 10:58:53.374186 4860 generic.go:334] "Generic (PLEG): container finished" podID="825c6b77-c03a-463c-b9a4-d26a1ac398f0" containerID="881f32cfad3d1ae79569c4b2cd108eca09988b3ebd33507a5a8247454327f28d" exitCode=0 Mar 20 10:58:53 crc kubenswrapper[4860]: I0320 10:58:53.374291 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-45vfv" event={"ID":"825c6b77-c03a-463c-b9a4-d26a1ac398f0","Type":"ContainerDied","Data":"881f32cfad3d1ae79569c4b2cd108eca09988b3ebd33507a5a8247454327f28d"} Mar 20 10:58:57 crc kubenswrapper[4860]: I0320 10:58:57.978061 4860 patch_prober.go:28] interesting pod/controller-manager-9ffd4b47b-9qh65 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.45:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 10:58:57 crc kubenswrapper[4860]: I0320 10:58:57.978662 4860 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-controller-manager/controller-manager-9ffd4b47b-9qh65" podUID="2af6ac3e-b4b2-400b-bfd2-0142f07dabd0" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.45:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 10:58:59 crc kubenswrapper[4860]: I0320 10:58:59.397921 4860 patch_prober.go:28] interesting pod/route-controller-manager-5dc8897f6c-8dhfx container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.46:8443/healthz\": dial tcp 10.217.0.46:8443: connect: connection refused" start-of-body= Mar 20 10:58:59 crc kubenswrapper[4860]: I0320 10:58:59.398064 4860 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-5dc8897f6c-8dhfx" podUID="215a61d8-f0e1-419d-b4cb-8ddc801d5a79" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.46:8443/healthz\": dial tcp 10.217.0.46:8443: connect: connection refused" Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.397177 4860 patch_prober.go:28] interesting pod/downloads-7954f5f757-45vfv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.399000 4860 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-45vfv" podUID="825c6b77-c03a-463c-b9a4-d26a1ac398f0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.632706 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-9ffd4b47b-9qh65" Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.677824 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-589db55d97-b7n5d"] Mar 20 10:59:02 crc kubenswrapper[4860]: E0320 10:59:02.678175 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7f16bf2-db43-4057-9961-ef03202f7828" containerName="pruner" Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.678191 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7f16bf2-db43-4057-9961-ef03202f7828" containerName="pruner" Mar 20 10:59:02 crc kubenswrapper[4860]: E0320 10:59:02.678217 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="437c32d4-4b5f-4657-86d6-5214e3bfc01f" containerName="collect-profiles" Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.678259 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="437c32d4-4b5f-4657-86d6-5214e3bfc01f" containerName="collect-profiles" Mar 20 10:59:02 crc kubenswrapper[4860]: E0320 10:59:02.678282 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee04bfe1-80c1-43ea-9c2f-a8dde5f81388" containerName="pruner" Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.678291 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee04bfe1-80c1-43ea-9c2f-a8dde5f81388" containerName="pruner" Mar 20 10:59:02 crc kubenswrapper[4860]: E0320 10:59:02.678299 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2af6ac3e-b4b2-400b-bfd2-0142f07dabd0" containerName="controller-manager" Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.678307 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="2af6ac3e-b4b2-400b-bfd2-0142f07dabd0" containerName="controller-manager" Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.678641 4860 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="2af6ac3e-b4b2-400b-bfd2-0142f07dabd0" containerName="controller-manager" Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.678659 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7f16bf2-db43-4057-9961-ef03202f7828" containerName="pruner" Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.678668 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="437c32d4-4b5f-4657-86d6-5214e3bfc01f" containerName="collect-profiles" Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.678678 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee04bfe1-80c1-43ea-9c2f-a8dde5f81388" containerName="pruner" Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.679181 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-589db55d97-b7n5d" Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.692032 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-589db55d97-b7n5d"] Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.763553 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mcqdw" Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.771937 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2af6ac3e-b4b2-400b-bfd2-0142f07dabd0-config\") pod \"2af6ac3e-b4b2-400b-bfd2-0142f07dabd0\" (UID: \"2af6ac3e-b4b2-400b-bfd2-0142f07dabd0\") " Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.772052 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdqss\" (UniqueName: \"kubernetes.io/projected/2af6ac3e-b4b2-400b-bfd2-0142f07dabd0-kube-api-access-bdqss\") pod \"2af6ac3e-b4b2-400b-bfd2-0142f07dabd0\" (UID: \"2af6ac3e-b4b2-400b-bfd2-0142f07dabd0\") " 
Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.772075 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2af6ac3e-b4b2-400b-bfd2-0142f07dabd0-serving-cert\") pod \"2af6ac3e-b4b2-400b-bfd2-0142f07dabd0\" (UID: \"2af6ac3e-b4b2-400b-bfd2-0142f07dabd0\") " Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.772104 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2af6ac3e-b4b2-400b-bfd2-0142f07dabd0-client-ca\") pod \"2af6ac3e-b4b2-400b-bfd2-0142f07dabd0\" (UID: \"2af6ac3e-b4b2-400b-bfd2-0142f07dabd0\") " Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.772176 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2af6ac3e-b4b2-400b-bfd2-0142f07dabd0-proxy-ca-bundles\") pod \"2af6ac3e-b4b2-400b-bfd2-0142f07dabd0\" (UID: \"2af6ac3e-b4b2-400b-bfd2-0142f07dabd0\") " Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.773017 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2af6ac3e-b4b2-400b-bfd2-0142f07dabd0-client-ca" (OuterVolumeSpecName: "client-ca") pod "2af6ac3e-b4b2-400b-bfd2-0142f07dabd0" (UID: "2af6ac3e-b4b2-400b-bfd2-0142f07dabd0"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.773467 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2af6ac3e-b4b2-400b-bfd2-0142f07dabd0-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "2af6ac3e-b4b2-400b-bfd2-0142f07dabd0" (UID: "2af6ac3e-b4b2-400b-bfd2-0142f07dabd0"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.773975 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmctq\" (UniqueName: \"kubernetes.io/projected/0dfd8c66-3b51-4cf2-acbb-eb764785f6d3-kube-api-access-fmctq\") pod \"controller-manager-589db55d97-b7n5d\" (UID: \"0dfd8c66-3b51-4cf2-acbb-eb764785f6d3\") " pod="openshift-controller-manager/controller-manager-589db55d97-b7n5d" Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.774091 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0dfd8c66-3b51-4cf2-acbb-eb764785f6d3-config\") pod \"controller-manager-589db55d97-b7n5d\" (UID: \"0dfd8c66-3b51-4cf2-acbb-eb764785f6d3\") " pod="openshift-controller-manager/controller-manager-589db55d97-b7n5d" Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.774132 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2af6ac3e-b4b2-400b-bfd2-0142f07dabd0-config" (OuterVolumeSpecName: "config") pod "2af6ac3e-b4b2-400b-bfd2-0142f07dabd0" (UID: "2af6ac3e-b4b2-400b-bfd2-0142f07dabd0"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.774182 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0dfd8c66-3b51-4cf2-acbb-eb764785f6d3-serving-cert\") pod \"controller-manager-589db55d97-b7n5d\" (UID: \"0dfd8c66-3b51-4cf2-acbb-eb764785f6d3\") " pod="openshift-controller-manager/controller-manager-589db55d97-b7n5d" Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.774295 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0dfd8c66-3b51-4cf2-acbb-eb764785f6d3-proxy-ca-bundles\") pod \"controller-manager-589db55d97-b7n5d\" (UID: \"0dfd8c66-3b51-4cf2-acbb-eb764785f6d3\") " pod="openshift-controller-manager/controller-manager-589db55d97-b7n5d" Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.774333 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0dfd8c66-3b51-4cf2-acbb-eb764785f6d3-client-ca\") pod \"controller-manager-589db55d97-b7n5d\" (UID: \"0dfd8c66-3b51-4cf2-acbb-eb764785f6d3\") " pod="openshift-controller-manager/controller-manager-589db55d97-b7n5d" Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.774429 4860 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2af6ac3e-b4b2-400b-bfd2-0142f07dabd0-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.774443 4860 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2af6ac3e-b4b2-400b-bfd2-0142f07dabd0-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.774454 4860 reconciler_common.go:293] "Volume detached for volume 
\"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2af6ac3e-b4b2-400b-bfd2-0142f07dabd0-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.790995 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2af6ac3e-b4b2-400b-bfd2-0142f07dabd0-kube-api-access-bdqss" (OuterVolumeSpecName: "kube-api-access-bdqss") pod "2af6ac3e-b4b2-400b-bfd2-0142f07dabd0" (UID: "2af6ac3e-b4b2-400b-bfd2-0142f07dabd0"). InnerVolumeSpecName "kube-api-access-bdqss". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.791602 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2af6ac3e-b4b2-400b-bfd2-0142f07dabd0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2af6ac3e-b4b2-400b-bfd2-0142f07dabd0" (UID: "2af6ac3e-b4b2-400b-bfd2-0142f07dabd0"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.876113 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0dfd8c66-3b51-4cf2-acbb-eb764785f6d3-client-ca\") pod \"controller-manager-589db55d97-b7n5d\" (UID: \"0dfd8c66-3b51-4cf2-acbb-eb764785f6d3\") " pod="openshift-controller-manager/controller-manager-589db55d97-b7n5d" Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.876208 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmctq\" (UniqueName: \"kubernetes.io/projected/0dfd8c66-3b51-4cf2-acbb-eb764785f6d3-kube-api-access-fmctq\") pod \"controller-manager-589db55d97-b7n5d\" (UID: \"0dfd8c66-3b51-4cf2-acbb-eb764785f6d3\") " pod="openshift-controller-manager/controller-manager-589db55d97-b7n5d" Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.876274 4860 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0dfd8c66-3b51-4cf2-acbb-eb764785f6d3-config\") pod \"controller-manager-589db55d97-b7n5d\" (UID: \"0dfd8c66-3b51-4cf2-acbb-eb764785f6d3\") " pod="openshift-controller-manager/controller-manager-589db55d97-b7n5d" Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.876337 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0dfd8c66-3b51-4cf2-acbb-eb764785f6d3-serving-cert\") pod \"controller-manager-589db55d97-b7n5d\" (UID: \"0dfd8c66-3b51-4cf2-acbb-eb764785f6d3\") " pod="openshift-controller-manager/controller-manager-589db55d97-b7n5d" Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.876371 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0dfd8c66-3b51-4cf2-acbb-eb764785f6d3-proxy-ca-bundles\") pod \"controller-manager-589db55d97-b7n5d\" (UID: \"0dfd8c66-3b51-4cf2-acbb-eb764785f6d3\") " pod="openshift-controller-manager/controller-manager-589db55d97-b7n5d" Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.876427 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdqss\" (UniqueName: \"kubernetes.io/projected/2af6ac3e-b4b2-400b-bfd2-0142f07dabd0-kube-api-access-bdqss\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.876445 4860 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2af6ac3e-b4b2-400b-bfd2-0142f07dabd0-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.877351 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0dfd8c66-3b51-4cf2-acbb-eb764785f6d3-client-ca\") pod \"controller-manager-589db55d97-b7n5d\" (UID: 
\"0dfd8c66-3b51-4cf2-acbb-eb764785f6d3\") " pod="openshift-controller-manager/controller-manager-589db55d97-b7n5d" Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.877568 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0dfd8c66-3b51-4cf2-acbb-eb764785f6d3-proxy-ca-bundles\") pod \"controller-manager-589db55d97-b7n5d\" (UID: \"0dfd8c66-3b51-4cf2-acbb-eb764785f6d3\") " pod="openshift-controller-manager/controller-manager-589db55d97-b7n5d" Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.878043 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0dfd8c66-3b51-4cf2-acbb-eb764785f6d3-config\") pod \"controller-manager-589db55d97-b7n5d\" (UID: \"0dfd8c66-3b51-4cf2-acbb-eb764785f6d3\") " pod="openshift-controller-manager/controller-manager-589db55d97-b7n5d" Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.886045 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0dfd8c66-3b51-4cf2-acbb-eb764785f6d3-serving-cert\") pod \"controller-manager-589db55d97-b7n5d\" (UID: \"0dfd8c66-3b51-4cf2-acbb-eb764785f6d3\") " pod="openshift-controller-manager/controller-manager-589db55d97-b7n5d" Mar 20 10:59:02 crc kubenswrapper[4860]: I0320 10:59:02.895203 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmctq\" (UniqueName: \"kubernetes.io/projected/0dfd8c66-3b51-4cf2-acbb-eb764785f6d3-kube-api-access-fmctq\") pod \"controller-manager-589db55d97-b7n5d\" (UID: \"0dfd8c66-3b51-4cf2-acbb-eb764785f6d3\") " pod="openshift-controller-manager/controller-manager-589db55d97-b7n5d" Mar 20 10:59:03 crc kubenswrapper[4860]: I0320 10:59:03.007863 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-589db55d97-b7n5d" Mar 20 10:59:03 crc kubenswrapper[4860]: I0320 10:59:03.255064 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5dc8897f6c-8dhfx" Mar 20 10:59:03 crc kubenswrapper[4860]: I0320 10:59:03.385332 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/215a61d8-f0e1-419d-b4cb-8ddc801d5a79-client-ca\") pod \"215a61d8-f0e1-419d-b4cb-8ddc801d5a79\" (UID: \"215a61d8-f0e1-419d-b4cb-8ddc801d5a79\") " Mar 20 10:59:03 crc kubenswrapper[4860]: I0320 10:59:03.385398 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/215a61d8-f0e1-419d-b4cb-8ddc801d5a79-serving-cert\") pod \"215a61d8-f0e1-419d-b4cb-8ddc801d5a79\" (UID: \"215a61d8-f0e1-419d-b4cb-8ddc801d5a79\") " Mar 20 10:59:03 crc kubenswrapper[4860]: I0320 10:59:03.385439 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/215a61d8-f0e1-419d-b4cb-8ddc801d5a79-config\") pod \"215a61d8-f0e1-419d-b4cb-8ddc801d5a79\" (UID: \"215a61d8-f0e1-419d-b4cb-8ddc801d5a79\") " Mar 20 10:59:03 crc kubenswrapper[4860]: I0320 10:59:03.385555 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7npd\" (UniqueName: \"kubernetes.io/projected/215a61d8-f0e1-419d-b4cb-8ddc801d5a79-kube-api-access-n7npd\") pod \"215a61d8-f0e1-419d-b4cb-8ddc801d5a79\" (UID: \"215a61d8-f0e1-419d-b4cb-8ddc801d5a79\") " Mar 20 10:59:03 crc kubenswrapper[4860]: I0320 10:59:03.386691 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/215a61d8-f0e1-419d-b4cb-8ddc801d5a79-config" (OuterVolumeSpecName: "config") pod 
"215a61d8-f0e1-419d-b4cb-8ddc801d5a79" (UID: "215a61d8-f0e1-419d-b4cb-8ddc801d5a79"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:59:03 crc kubenswrapper[4860]: I0320 10:59:03.386778 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/215a61d8-f0e1-419d-b4cb-8ddc801d5a79-client-ca" (OuterVolumeSpecName: "client-ca") pod "215a61d8-f0e1-419d-b4cb-8ddc801d5a79" (UID: "215a61d8-f0e1-419d-b4cb-8ddc801d5a79"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:59:03 crc kubenswrapper[4860]: I0320 10:59:03.390503 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/215a61d8-f0e1-419d-b4cb-8ddc801d5a79-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "215a61d8-f0e1-419d-b4cb-8ddc801d5a79" (UID: "215a61d8-f0e1-419d-b4cb-8ddc801d5a79"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:59:03 crc kubenswrapper[4860]: I0320 10:59:03.394629 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/215a61d8-f0e1-419d-b4cb-8ddc801d5a79-kube-api-access-n7npd" (OuterVolumeSpecName: "kube-api-access-n7npd") pod "215a61d8-f0e1-419d-b4cb-8ddc801d5a79" (UID: "215a61d8-f0e1-419d-b4cb-8ddc801d5a79"). InnerVolumeSpecName "kube-api-access-n7npd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:59:03 crc kubenswrapper[4860]: I0320 10:59:03.451866 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5dc8897f6c-8dhfx" event={"ID":"215a61d8-f0e1-419d-b4cb-8ddc801d5a79","Type":"ContainerDied","Data":"7aa219357098d2e5cc353f906dff76cf1a673bc6396fbbfabef96091000e9adc"} Mar 20 10:59:03 crc kubenswrapper[4860]: I0320 10:59:03.451945 4860 scope.go:117] "RemoveContainer" containerID="07cb2faf1a3563eaa92fb2a9f50a19ad2493c7dab6c573aa3b095afac43feb29" Mar 20 10:59:03 crc kubenswrapper[4860]: I0320 10:59:03.452060 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5dc8897f6c-8dhfx" Mar 20 10:59:03 crc kubenswrapper[4860]: I0320 10:59:03.456717 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-9ffd4b47b-9qh65" event={"ID":"2af6ac3e-b4b2-400b-bfd2-0142f07dabd0","Type":"ContainerDied","Data":"63a0cd6f2cc2d9eeeaa1bb8e6f130de3ebcf3e0a4cb9179f998ccf85f93ed02e"} Mar 20 10:59:03 crc kubenswrapper[4860]: I0320 10:59:03.456829 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-9ffd4b47b-9qh65" Mar 20 10:59:03 crc kubenswrapper[4860]: I0320 10:59:03.479157 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5dc8897f6c-8dhfx"] Mar 20 10:59:03 crc kubenswrapper[4860]: I0320 10:59:03.483053 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5dc8897f6c-8dhfx"] Mar 20 10:59:03 crc kubenswrapper[4860]: I0320 10:59:03.489029 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7npd\" (UniqueName: \"kubernetes.io/projected/215a61d8-f0e1-419d-b4cb-8ddc801d5a79-kube-api-access-n7npd\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:03 crc kubenswrapper[4860]: I0320 10:59:03.489059 4860 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/215a61d8-f0e1-419d-b4cb-8ddc801d5a79-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:03 crc kubenswrapper[4860]: I0320 10:59:03.489071 4860 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/215a61d8-f0e1-419d-b4cb-8ddc801d5a79-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:03 crc kubenswrapper[4860]: I0320 10:59:03.489080 4860 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/215a61d8-f0e1-419d-b4cb-8ddc801d5a79-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:03 crc kubenswrapper[4860]: I0320 10:59:03.496660 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-9ffd4b47b-9qh65"] Mar 20 10:59:03 crc kubenswrapper[4860]: I0320 10:59:03.499799 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-9ffd4b47b-9qh65"] Mar 20 10:59:04 crc kubenswrapper[4860]: I0320 10:59:04.134291 4860 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-589db55d97-b7n5d"] Mar 20 10:59:04 crc kubenswrapper[4860]: I0320 10:59:04.268625 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56668659b9-pdhht"] Mar 20 10:59:04 crc kubenswrapper[4860]: E0320 10:59:04.268954 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="215a61d8-f0e1-419d-b4cb-8ddc801d5a79" containerName="route-controller-manager" Mar 20 10:59:04 crc kubenswrapper[4860]: I0320 10:59:04.268971 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="215a61d8-f0e1-419d-b4cb-8ddc801d5a79" containerName="route-controller-manager" Mar 20 10:59:04 crc kubenswrapper[4860]: I0320 10:59:04.269107 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="215a61d8-f0e1-419d-b4cb-8ddc801d5a79" containerName="route-controller-manager" Mar 20 10:59:04 crc kubenswrapper[4860]: I0320 10:59:04.269707 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-56668659b9-pdhht" Mar 20 10:59:04 crc kubenswrapper[4860]: I0320 10:59:04.272799 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 10:59:04 crc kubenswrapper[4860]: I0320 10:59:04.273016 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 10:59:04 crc kubenswrapper[4860]: I0320 10:59:04.273217 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 10:59:04 crc kubenswrapper[4860]: I0320 10:59:04.273565 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 10:59:04 crc kubenswrapper[4860]: I0320 10:59:04.284618 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 10:59:04 crc kubenswrapper[4860]: I0320 10:59:04.285557 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 10:59:04 crc kubenswrapper[4860]: I0320 10:59:04.291072 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56668659b9-pdhht"] Mar 20 10:59:04 crc kubenswrapper[4860]: I0320 10:59:04.404200 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9-serving-cert\") pod \"route-controller-manager-56668659b9-pdhht\" (UID: \"9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9\") " pod="openshift-route-controller-manager/route-controller-manager-56668659b9-pdhht" Mar 20 10:59:04 crc kubenswrapper[4860]: I0320 10:59:04.404277 4860 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59vs6\" (UniqueName: \"kubernetes.io/projected/9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9-kube-api-access-59vs6\") pod \"route-controller-manager-56668659b9-pdhht\" (UID: \"9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9\") " pod="openshift-route-controller-manager/route-controller-manager-56668659b9-pdhht" Mar 20 10:59:04 crc kubenswrapper[4860]: I0320 10:59:04.404497 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9-client-ca\") pod \"route-controller-manager-56668659b9-pdhht\" (UID: \"9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9\") " pod="openshift-route-controller-manager/route-controller-manager-56668659b9-pdhht" Mar 20 10:59:04 crc kubenswrapper[4860]: I0320 10:59:04.405163 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9-config\") pod \"route-controller-manager-56668659b9-pdhht\" (UID: \"9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9\") " pod="openshift-route-controller-manager/route-controller-manager-56668659b9-pdhht" Mar 20 10:59:04 crc kubenswrapper[4860]: I0320 10:59:04.506760 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9-client-ca\") pod \"route-controller-manager-56668659b9-pdhht\" (UID: \"9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9\") " pod="openshift-route-controller-manager/route-controller-manager-56668659b9-pdhht" Mar 20 10:59:04 crc kubenswrapper[4860]: I0320 10:59:04.506835 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9-config\") pod \"route-controller-manager-56668659b9-pdhht\" (UID: 
\"9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9\") " pod="openshift-route-controller-manager/route-controller-manager-56668659b9-pdhht" Mar 20 10:59:04 crc kubenswrapper[4860]: I0320 10:59:04.506904 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9-serving-cert\") pod \"route-controller-manager-56668659b9-pdhht\" (UID: \"9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9\") " pod="openshift-route-controller-manager/route-controller-manager-56668659b9-pdhht" Mar 20 10:59:04 crc kubenswrapper[4860]: I0320 10:59:04.506932 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59vs6\" (UniqueName: \"kubernetes.io/projected/9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9-kube-api-access-59vs6\") pod \"route-controller-manager-56668659b9-pdhht\" (UID: \"9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9\") " pod="openshift-route-controller-manager/route-controller-manager-56668659b9-pdhht" Mar 20 10:59:04 crc kubenswrapper[4860]: I0320 10:59:04.508710 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9-client-ca\") pod \"route-controller-manager-56668659b9-pdhht\" (UID: \"9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9\") " pod="openshift-route-controller-manager/route-controller-manager-56668659b9-pdhht" Mar 20 10:59:04 crc kubenswrapper[4860]: I0320 10:59:04.509208 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9-config\") pod \"route-controller-manager-56668659b9-pdhht\" (UID: \"9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9\") " pod="openshift-route-controller-manager/route-controller-manager-56668659b9-pdhht" Mar 20 10:59:04 crc kubenswrapper[4860]: I0320 10:59:04.521481 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9-serving-cert\") pod \"route-controller-manager-56668659b9-pdhht\" (UID: \"9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9\") " pod="openshift-route-controller-manager/route-controller-manager-56668659b9-pdhht" Mar 20 10:59:04 crc kubenswrapper[4860]: I0320 10:59:04.529044 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59vs6\" (UniqueName: \"kubernetes.io/projected/9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9-kube-api-access-59vs6\") pod \"route-controller-manager-56668659b9-pdhht\" (UID: \"9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9\") " pod="openshift-route-controller-manager/route-controller-manager-56668659b9-pdhht" Mar 20 10:59:04 crc kubenswrapper[4860]: E0320 10:59:04.551708 4860 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 20 10:59:04 crc kubenswrapper[4860]: E0320 10:59:04.551909 4860 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 10:59:04 crc kubenswrapper[4860]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 20 10:59:04 crc kubenswrapper[4860]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5hjpq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29566738-5cj22_openshift-infra(ba2ab33e-6ecc-4eac-9aaa-256e6ff68236): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 20 10:59:04 crc kubenswrapper[4860]: > logger="UnhandledError" Mar 20 10:59:04 crc kubenswrapper[4860]: E0320 10:59:04.553074 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29566738-5cj22" podUID="ba2ab33e-6ecc-4eac-9aaa-256e6ff68236" Mar 20 10:59:04 crc kubenswrapper[4860]: I0320 10:59:04.586096 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-56668659b9-pdhht" Mar 20 10:59:05 crc kubenswrapper[4860]: I0320 10:59:05.419923 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="215a61d8-f0e1-419d-b4cb-8ddc801d5a79" path="/var/lib/kubelet/pods/215a61d8-f0e1-419d-b4cb-8ddc801d5a79/volumes" Mar 20 10:59:05 crc kubenswrapper[4860]: I0320 10:59:05.420537 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2af6ac3e-b4b2-400b-bfd2-0142f07dabd0" path="/var/lib/kubelet/pods/2af6ac3e-b4b2-400b-bfd2-0142f07dabd0/volumes" Mar 20 10:59:05 crc kubenswrapper[4860]: E0320 10:59:05.478924 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29566738-5cj22" podUID="ba2ab33e-6ecc-4eac-9aaa-256e6ff68236" Mar 20 10:59:08 crc kubenswrapper[4860]: I0320 10:59:08.823378 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 20 10:59:08 crc kubenswrapper[4860]: I0320 10:59:08.826354 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 10:59:08 crc kubenswrapper[4860]: I0320 10:59:08.827373 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 20 10:59:08 crc kubenswrapper[4860]: I0320 10:59:08.831409 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 20 10:59:08 crc kubenswrapper[4860]: I0320 10:59:08.831485 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 20 10:59:08 crc kubenswrapper[4860]: I0320 10:59:08.948482 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/da7fc050-0408-49bf-a97d-3b5935573dc7-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"da7fc050-0408-49bf-a97d-3b5935573dc7\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 10:59:08 crc kubenswrapper[4860]: I0320 10:59:08.948627 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/da7fc050-0408-49bf-a97d-3b5935573dc7-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"da7fc050-0408-49bf-a97d-3b5935573dc7\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 10:59:09 crc kubenswrapper[4860]: I0320 10:59:09.049769 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/da7fc050-0408-49bf-a97d-3b5935573dc7-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"da7fc050-0408-49bf-a97d-3b5935573dc7\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 10:59:09 crc kubenswrapper[4860]: I0320 10:59:09.049847 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/da7fc050-0408-49bf-a97d-3b5935573dc7-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"da7fc050-0408-49bf-a97d-3b5935573dc7\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 10:59:09 crc kubenswrapper[4860]: I0320 10:59:09.049986 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/da7fc050-0408-49bf-a97d-3b5935573dc7-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"da7fc050-0408-49bf-a97d-3b5935573dc7\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 10:59:09 crc kubenswrapper[4860]: I0320 10:59:09.070001 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/da7fc050-0408-49bf-a97d-3b5935573dc7-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"da7fc050-0408-49bf-a97d-3b5935573dc7\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 10:59:09 crc kubenswrapper[4860]: I0320 10:59:09.150423 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 10:59:09 crc kubenswrapper[4860]: E0320 10:59:09.407474 4860 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 20 10:59:09 crc kubenswrapper[4860]: E0320 10:59:09.408031 4860 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kg46p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]Co
ntainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-d9xlp_openshift-marketplace(f20cb95e-5480-4c9c-859f-0b03d679ab06): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 10:59:09 crc kubenswrapper[4860]: E0320 10:59:09.409373 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-d9xlp" podUID="f20cb95e-5480-4c9c-859f-0b03d679ab06" Mar 20 10:59:11 crc kubenswrapper[4860]: E0320 10:59:11.334955 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-d9xlp" podUID="f20cb95e-5480-4c9c-859f-0b03d679ab06" Mar 20 10:59:11 crc kubenswrapper[4860]: E0320 10:59:11.432937 4860 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 20 10:59:11 crc kubenswrapper[4860]: E0320 10:59:11.433506 4860 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4h2z2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-w5w95_openshift-marketplace(4f84f111-5991-4e78-9508-82283b8e36f7): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 10:59:11 crc kubenswrapper[4860]: E0320 10:59:11.434720 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-w5w95" podUID="4f84f111-5991-4e78-9508-82283b8e36f7" Mar 20 10:59:11 crc 
kubenswrapper[4860]: E0320 10:59:11.479723 4860 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 20 10:59:11 crc kubenswrapper[4860]: E0320 10:59:11.479944 4860 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-77rcr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
certified-operators-p6sh9_openshift-marketplace(7b622f82-e01c-42b8-8061-16b6e8f551fb): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 10:59:11 crc kubenswrapper[4860]: E0320 10:59:11.481304 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-p6sh9" podUID="7b622f82-e01c-42b8-8061-16b6e8f551fb" Mar 20 10:59:11 crc kubenswrapper[4860]: E0320 10:59:11.498262 4860 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 20 10:59:11 crc kubenswrapper[4860]: E0320 10:59:11.498463 4860 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rln5j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-5jpww_openshift-marketplace(2268b7ae-c1db-4ef4-8236-60f7cfa277a1): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 10:59:11 crc kubenswrapper[4860]: E0320 10:59:11.502442 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-5jpww" podUID="2268b7ae-c1db-4ef4-8236-60f7cfa277a1" Mar 20 10:59:11 crc 
kubenswrapper[4860]: I0320 10:59:11.566466 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-q85gq"] Mar 20 10:59:12 crc kubenswrapper[4860]: I0320 10:59:12.389257 4860 patch_prober.go:28] interesting pod/downloads-7954f5f757-45vfv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Mar 20 10:59:12 crc kubenswrapper[4860]: I0320 10:59:12.389321 4860 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-45vfv" podUID="825c6b77-c03a-463c-b9a4-d26a1ac398f0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Mar 20 10:59:13 crc kubenswrapper[4860]: E0320 10:59:12.948329 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-5jpww" podUID="2268b7ae-c1db-4ef4-8236-60f7cfa277a1" Mar 20 10:59:13 crc kubenswrapper[4860]: E0320 10:59:12.951482 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-w5w95" podUID="4f84f111-5991-4e78-9508-82283b8e36f7" Mar 20 10:59:13 crc kubenswrapper[4860]: E0320 10:59:12.951569 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-p6sh9" 
podUID="7b622f82-e01c-42b8-8061-16b6e8f551fb" Mar 20 10:59:13 crc kubenswrapper[4860]: I0320 10:59:12.955294 4860 scope.go:117] "RemoveContainer" containerID="9e7a764e0c672086dc2232d2c01074598903d934f8d665fc7b2f1e55bf12ba31" Mar 20 10:59:13 crc kubenswrapper[4860]: E0320 10:59:13.087645 4860 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 20 10:59:13 crc kubenswrapper[4860]: E0320 10:59:13.087908 4860 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6fks5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,T
TY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-n79b7_openshift-marketplace(d2690d8b-c7f7-4e71-af44-33444e4d6187): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 10:59:13 crc kubenswrapper[4860]: E0320 10:59:13.089458 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-n79b7" podUID="d2690d8b-c7f7-4e71-af44-33444e4d6187" Mar 20 10:59:14 crc kubenswrapper[4860]: I0320 10:59:14.207660 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 20 10:59:14 crc kubenswrapper[4860]: I0320 10:59:14.208951 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 10:59:14 crc kubenswrapper[4860]: I0320 10:59:14.211143 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 20 10:59:14 crc kubenswrapper[4860]: I0320 10:59:14.328030 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f3e81cb0-739d-41be-b9fb-53a1c3de6a3d-var-lock\") pod \"installer-9-crc\" (UID: \"f3e81cb0-739d-41be-b9fb-53a1c3de6a3d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 10:59:14 crc kubenswrapper[4860]: I0320 10:59:14.328568 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f3e81cb0-739d-41be-b9fb-53a1c3de6a3d-kube-api-access\") pod \"installer-9-crc\" (UID: \"f3e81cb0-739d-41be-b9fb-53a1c3de6a3d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 10:59:14 crc kubenswrapper[4860]: I0320 10:59:14.328683 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f3e81cb0-739d-41be-b9fb-53a1c3de6a3d-kubelet-dir\") pod \"installer-9-crc\" (UID: \"f3e81cb0-739d-41be-b9fb-53a1c3de6a3d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 10:59:14 crc kubenswrapper[4860]: I0320 10:59:14.430363 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f3e81cb0-739d-41be-b9fb-53a1c3de6a3d-kube-api-access\") pod \"installer-9-crc\" (UID: \"f3e81cb0-739d-41be-b9fb-53a1c3de6a3d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 10:59:14 crc kubenswrapper[4860]: I0320 10:59:14.430428 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/f3e81cb0-739d-41be-b9fb-53a1c3de6a3d-kubelet-dir\") pod \"installer-9-crc\" (UID: \"f3e81cb0-739d-41be-b9fb-53a1c3de6a3d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 10:59:14 crc kubenswrapper[4860]: I0320 10:59:14.430454 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f3e81cb0-739d-41be-b9fb-53a1c3de6a3d-var-lock\") pod \"installer-9-crc\" (UID: \"f3e81cb0-739d-41be-b9fb-53a1c3de6a3d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 10:59:14 crc kubenswrapper[4860]: I0320 10:59:14.430532 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f3e81cb0-739d-41be-b9fb-53a1c3de6a3d-var-lock\") pod \"installer-9-crc\" (UID: \"f3e81cb0-739d-41be-b9fb-53a1c3de6a3d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 10:59:14 crc kubenswrapper[4860]: I0320 10:59:14.430597 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f3e81cb0-739d-41be-b9fb-53a1c3de6a3d-kubelet-dir\") pod \"installer-9-crc\" (UID: \"f3e81cb0-739d-41be-b9fb-53a1c3de6a3d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 10:59:14 crc kubenswrapper[4860]: I0320 10:59:14.456252 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f3e81cb0-739d-41be-b9fb-53a1c3de6a3d-kube-api-access\") pod \"installer-9-crc\" (UID: \"f3e81cb0-739d-41be-b9fb-53a1c3de6a3d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 10:59:14 crc kubenswrapper[4860]: I0320 10:59:14.541847 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 10:59:17 crc kubenswrapper[4860]: E0320 10:59:17.069361 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-n79b7" podUID="d2690d8b-c7f7-4e71-af44-33444e4d6187" Mar 20 10:59:17 crc kubenswrapper[4860]: W0320 10:59:17.070177 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod035f0b3d_92ee_4564_8dad_28b231e1c800.slice/crio-32118c993437fb49061fcbe8f50cc8afb3dd3653401bb561c3d02a24ff85d7ff WatchSource:0}: Error finding container 32118c993437fb49061fcbe8f50cc8afb3dd3653401bb561c3d02a24ff85d7ff: Status 404 returned error can't find the container with id 32118c993437fb49061fcbe8f50cc8afb3dd3653401bb561c3d02a24ff85d7ff Mar 20 10:59:17 crc kubenswrapper[4860]: E0320 10:59:17.245490 4860 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 20 10:59:17 crc kubenswrapper[4860]: E0320 10:59:17.246142 4860 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mn9d7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-jx27x_openshift-marketplace(f81a43aa-2c39-4d49-8526-f097322dd7bf): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 10:59:17 crc kubenswrapper[4860]: E0320 10:59:17.247958 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-jx27x" podUID="f81a43aa-2c39-4d49-8526-f097322dd7bf" Mar 20 10:59:17 crc 
kubenswrapper[4860]: E0320 10:59:17.267806 4860 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 20 10:59:17 crc kubenswrapper[4860]: E0320 10:59:17.268058 4860 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pdzkg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-operators-qq8bh_openshift-marketplace(514f05c3-1404-46c6-9f4d-68437ea8ee0b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 10:59:17 crc kubenswrapper[4860]: E0320 10:59:17.269195 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-qq8bh" podUID="514f05c3-1404-46c6-9f4d-68437ea8ee0b" Mar 20 10:59:17 crc kubenswrapper[4860]: E0320 10:59:17.283138 4860 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 20 10:59:17 crc kubenswrapper[4860]: E0320 10:59:17.283331 4860 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jhz7s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-r7ckk_openshift-marketplace(f0e14a08-824b-450f-bf98-2a476da0d44b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 10:59:17 crc kubenswrapper[4860]: E0320 10:59:17.285435 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-r7ckk" podUID="f0e14a08-824b-450f-bf98-2a476da0d44b" Mar 20 10:59:17 crc 
kubenswrapper[4860]: I0320 10:59:17.538194 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 20 10:59:17 crc kubenswrapper[4860]: I0320 10:59:17.562714 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-q85gq" event={"ID":"035f0b3d-92ee-4564-8dad-28b231e1c800","Type":"ContainerStarted","Data":"554d77b0c1bd97b8a6708815bacc2eb2bc0081bc636f5dec384b6a541029bbe9"} Mar 20 10:59:17 crc kubenswrapper[4860]: I0320 10:59:17.562763 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-q85gq" event={"ID":"035f0b3d-92ee-4564-8dad-28b231e1c800","Type":"ContainerStarted","Data":"32118c993437fb49061fcbe8f50cc8afb3dd3653401bb561c3d02a24ff85d7ff"} Mar 20 10:59:17 crc kubenswrapper[4860]: I0320 10:59:17.566543 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-45vfv" event={"ID":"825c6b77-c03a-463c-b9a4-d26a1ac398f0","Type":"ContainerStarted","Data":"ccdbfdc51f663e0a673d48a4d03c4efc47d1bc66fe97ee784d2b2cb54a8d3d07"} Mar 20 10:59:17 crc kubenswrapper[4860]: I0320 10:59:17.567397 4860 patch_prober.go:28] interesting pod/downloads-7954f5f757-45vfv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Mar 20 10:59:17 crc kubenswrapper[4860]: I0320 10:59:17.567403 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-45vfv" Mar 20 10:59:17 crc kubenswrapper[4860]: I0320 10:59:17.567443 4860 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-45vfv" podUID="825c6b77-c03a-463c-b9a4-d26a1ac398f0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" 
Mar 20 10:59:17 crc kubenswrapper[4860]: E0320 10:59:17.572892 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-jx27x" podUID="f81a43aa-2c39-4d49-8526-f097322dd7bf" Mar 20 10:59:17 crc kubenswrapper[4860]: E0320 10:59:17.573165 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-r7ckk" podUID="f0e14a08-824b-450f-bf98-2a476da0d44b" Mar 20 10:59:17 crc kubenswrapper[4860]: E0320 10:59:17.573261 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-qq8bh" podUID="514f05c3-1404-46c6-9f4d-68437ea8ee0b" Mar 20 10:59:17 crc kubenswrapper[4860]: I0320 10:59:17.624427 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 20 10:59:17 crc kubenswrapper[4860]: I0320 10:59:17.666149 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-589db55d97-b7n5d"] Mar 20 10:59:17 crc kubenswrapper[4860]: I0320 10:59:17.682583 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56668659b9-pdhht"] Mar 20 10:59:17 crc kubenswrapper[4860]: W0320 10:59:17.699858 4860 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0dfd8c66_3b51_4cf2_acbb_eb764785f6d3.slice/crio-c9e85d8e13725f9c1784bd56077ae640fa1d305a310b954cd20220316b4a0079 WatchSource:0}: Error finding container c9e85d8e13725f9c1784bd56077ae640fa1d305a310b954cd20220316b4a0079: Status 404 returned error can't find the container with id c9e85d8e13725f9c1784bd56077ae640fa1d305a310b954cd20220316b4a0079 Mar 20 10:59:17 crc kubenswrapper[4860]: W0320 10:59:17.700945 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b0ce2f8_6bac_4fd7_81ad_2478d13e62c9.slice/crio-fd4fb2650db90abfc2fc04379e97c1d923867f1354b3859056544eb5327ee868 WatchSource:0}: Error finding container fd4fb2650db90abfc2fc04379e97c1d923867f1354b3859056544eb5327ee868: Status 404 returned error can't find the container with id fd4fb2650db90abfc2fc04379e97c1d923867f1354b3859056544eb5327ee868 Mar 20 10:59:18 crc kubenswrapper[4860]: I0320 10:59:18.603151 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"da7fc050-0408-49bf-a97d-3b5935573dc7","Type":"ContainerStarted","Data":"266bdd2b9062e1f53f946b2bc3199ddc199622c764b2624c18e3421ceef03cb2"} Mar 20 10:59:18 crc kubenswrapper[4860]: I0320 10:59:18.605364 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"da7fc050-0408-49bf-a97d-3b5935573dc7","Type":"ContainerStarted","Data":"19d920be2e0ceb91c887b842ac3e676890de81e2b61d14681a49db1194b2588e"} Mar 20 10:59:18 crc kubenswrapper[4860]: I0320 10:59:18.605477 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-56668659b9-pdhht" event={"ID":"9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9","Type":"ContainerStarted","Data":"16a49325d8d5b96f821cc45d49f4e0565898b38dd7a7655ded8401b745e9fbeb"} Mar 20 10:59:18 crc kubenswrapper[4860]: I0320 
10:59:18.605616 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-56668659b9-pdhht" Mar 20 10:59:18 crc kubenswrapper[4860]: I0320 10:59:18.605716 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-56668659b9-pdhht" event={"ID":"9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9","Type":"ContainerStarted","Data":"fd4fb2650db90abfc2fc04379e97c1d923867f1354b3859056544eb5327ee868"} Mar 20 10:59:18 crc kubenswrapper[4860]: I0320 10:59:18.608069 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"f3e81cb0-739d-41be-b9fb-53a1c3de6a3d","Type":"ContainerStarted","Data":"8d39974009a23179ff960e42776dd6479d915e26394e49ae5752f3d385d29790"} Mar 20 10:59:18 crc kubenswrapper[4860]: I0320 10:59:18.608139 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"f3e81cb0-739d-41be-b9fb-53a1c3de6a3d","Type":"ContainerStarted","Data":"e4aa9d22701e1f5fe237a916fc50c217011c513544d30da810d469bc44fe2386"} Mar 20 10:59:18 crc kubenswrapper[4860]: I0320 10:59:18.609859 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-589db55d97-b7n5d" event={"ID":"0dfd8c66-3b51-4cf2-acbb-eb764785f6d3","Type":"ContainerStarted","Data":"7ad737b59cd7b7eee4027b7faa0b060cf06623e1799cc126eaa5ba23b4f1f09f"} Mar 20 10:59:18 crc kubenswrapper[4860]: I0320 10:59:18.609892 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-589db55d97-b7n5d" event={"ID":"0dfd8c66-3b51-4cf2-acbb-eb764785f6d3","Type":"ContainerStarted","Data":"c9e85d8e13725f9c1784bd56077ae640fa1d305a310b954cd20220316b4a0079"} Mar 20 10:59:18 crc kubenswrapper[4860]: I0320 10:59:18.609969 4860 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-controller-manager/controller-manager-589db55d97-b7n5d" podUID="0dfd8c66-3b51-4cf2-acbb-eb764785f6d3" containerName="controller-manager" containerID="cri-o://7ad737b59cd7b7eee4027b7faa0b060cf06623e1799cc126eaa5ba23b4f1f09f" gracePeriod=30 Mar 20 10:59:18 crc kubenswrapper[4860]: I0320 10:59:18.610126 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-589db55d97-b7n5d" Mar 20 10:59:18 crc kubenswrapper[4860]: I0320 10:59:18.615787 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-q85gq" event={"ID":"035f0b3d-92ee-4564-8dad-28b231e1c800","Type":"ContainerStarted","Data":"c4be3c5de2033f35854cf2c0ff9946500d6c0c890b9bd6674aa13c8d1bf3d782"} Mar 20 10:59:18 crc kubenswrapper[4860]: I0320 10:59:18.616710 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-589db55d97-b7n5d" Mar 20 10:59:18 crc kubenswrapper[4860]: I0320 10:59:18.618264 4860 patch_prober.go:28] interesting pod/downloads-7954f5f757-45vfv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Mar 20 10:59:18 crc kubenswrapper[4860]: I0320 10:59:18.618309 4860 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-45vfv" podUID="825c6b77-c03a-463c-b9a4-d26a1ac398f0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Mar 20 10:59:18 crc kubenswrapper[4860]: I0320 10:59:18.632592 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=10.632571202 podStartE2EDuration="10.632571202s" podCreationTimestamp="2026-03-20 10:59:08 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:59:18.631409519 +0000 UTC m=+282.852770417" watchObservedRunningTime="2026-03-20 10:59:18.632571202 +0000 UTC m=+282.853932100" Mar 20 10:59:18 crc kubenswrapper[4860]: I0320 10:59:18.653392 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-56668659b9-pdhht" podStartSLOduration=14.653366431 podStartE2EDuration="14.653366431s" podCreationTimestamp="2026-03-20 10:59:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:59:18.649544813 +0000 UTC m=+282.870905731" watchObservedRunningTime="2026-03-20 10:59:18.653366431 +0000 UTC m=+282.874727339" Mar 20 10:59:18 crc kubenswrapper[4860]: I0320 10:59:18.668284 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-q85gq" podStartSLOduration=210.668256113 podStartE2EDuration="3m30.668256113s" podCreationTimestamp="2026-03-20 10:55:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:59:18.667625636 +0000 UTC m=+282.888986534" watchObservedRunningTime="2026-03-20 10:59:18.668256113 +0000 UTC m=+282.889617011" Mar 20 10:59:18 crc kubenswrapper[4860]: I0320 10:59:18.685674 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-589db55d97-b7n5d" podStartSLOduration=34.685644986 podStartE2EDuration="34.685644986s" podCreationTimestamp="2026-03-20 10:58:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:59:18.682678112 +0000 UTC m=+282.904039010" 
watchObservedRunningTime="2026-03-20 10:59:18.685644986 +0000 UTC m=+282.907005884" Mar 20 10:59:18 crc kubenswrapper[4860]: I0320 10:59:18.757092 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-56668659b9-pdhht" Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.544948 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-589db55d97-b7n5d" Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.573504 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-678879cf8c-f74zz"] Mar 20 10:59:19 crc kubenswrapper[4860]: E0320 10:59:19.573821 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dfd8c66-3b51-4cf2-acbb-eb764785f6d3" containerName="controller-manager" Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.573840 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dfd8c66-3b51-4cf2-acbb-eb764785f6d3" containerName="controller-manager" Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.573991 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="0dfd8c66-3b51-4cf2-acbb-eb764785f6d3" containerName="controller-manager" Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.574538 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-678879cf8c-f74zz" Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.591561 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-678879cf8c-f74zz"] Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.624527 4860 generic.go:334] "Generic (PLEG): container finished" podID="da7fc050-0408-49bf-a97d-3b5935573dc7" containerID="266bdd2b9062e1f53f946b2bc3199ddc199622c764b2624c18e3421ceef03cb2" exitCode=0 Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.624608 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"da7fc050-0408-49bf-a97d-3b5935573dc7","Type":"ContainerDied","Data":"266bdd2b9062e1f53f946b2bc3199ddc199622c764b2624c18e3421ceef03cb2"} Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.625921 4860 generic.go:334] "Generic (PLEG): container finished" podID="0dfd8c66-3b51-4cf2-acbb-eb764785f6d3" containerID="7ad737b59cd7b7eee4027b7faa0b060cf06623e1799cc126eaa5ba23b4f1f09f" exitCode=0 Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.626564 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-589db55d97-b7n5d" Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.626671 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-589db55d97-b7n5d" event={"ID":"0dfd8c66-3b51-4cf2-acbb-eb764785f6d3","Type":"ContainerDied","Data":"7ad737b59cd7b7eee4027b7faa0b060cf06623e1799cc126eaa5ba23b4f1f09f"} Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.626694 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-589db55d97-b7n5d" event={"ID":"0dfd8c66-3b51-4cf2-acbb-eb764785f6d3","Type":"ContainerDied","Data":"c9e85d8e13725f9c1784bd56077ae640fa1d305a310b954cd20220316b4a0079"} Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.626712 4860 scope.go:117] "RemoveContainer" containerID="7ad737b59cd7b7eee4027b7faa0b060cf06623e1799cc126eaa5ba23b4f1f09f" Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.628004 4860 patch_prober.go:28] interesting pod/downloads-7954f5f757-45vfv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.628048 4860 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-45vfv" podUID="825c6b77-c03a-463c-b9a4-d26a1ac398f0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.650649 4860 scope.go:117] "RemoveContainer" containerID="7ad737b59cd7b7eee4027b7faa0b060cf06623e1799cc126eaa5ba23b4f1f09f" Mar 20 10:59:19 crc kubenswrapper[4860]: E0320 10:59:19.651283 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could 
not find container \"7ad737b59cd7b7eee4027b7faa0b060cf06623e1799cc126eaa5ba23b4f1f09f\": container with ID starting with 7ad737b59cd7b7eee4027b7faa0b060cf06623e1799cc126eaa5ba23b4f1f09f not found: ID does not exist" containerID="7ad737b59cd7b7eee4027b7faa0b060cf06623e1799cc126eaa5ba23b4f1f09f" Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.651329 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ad737b59cd7b7eee4027b7faa0b060cf06623e1799cc126eaa5ba23b4f1f09f"} err="failed to get container status \"7ad737b59cd7b7eee4027b7faa0b060cf06623e1799cc126eaa5ba23b4f1f09f\": rpc error: code = NotFound desc = could not find container \"7ad737b59cd7b7eee4027b7faa0b060cf06623e1799cc126eaa5ba23b4f1f09f\": container with ID starting with 7ad737b59cd7b7eee4027b7faa0b060cf06623e1799cc126eaa5ba23b4f1f09f not found: ID does not exist" Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.666185 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=5.66615936 podStartE2EDuration="5.66615936s" podCreationTimestamp="2026-03-20 10:59:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:59:19.664639097 +0000 UTC m=+283.885999995" watchObservedRunningTime="2026-03-20 10:59:19.66615936 +0000 UTC m=+283.887520258" Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.719463 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0dfd8c66-3b51-4cf2-acbb-eb764785f6d3-serving-cert\") pod \"0dfd8c66-3b51-4cf2-acbb-eb764785f6d3\" (UID: \"0dfd8c66-3b51-4cf2-acbb-eb764785f6d3\") " Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.719542 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/0dfd8c66-3b51-4cf2-acbb-eb764785f6d3-config\") pod \"0dfd8c66-3b51-4cf2-acbb-eb764785f6d3\" (UID: \"0dfd8c66-3b51-4cf2-acbb-eb764785f6d3\") " Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.719583 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmctq\" (UniqueName: \"kubernetes.io/projected/0dfd8c66-3b51-4cf2-acbb-eb764785f6d3-kube-api-access-fmctq\") pod \"0dfd8c66-3b51-4cf2-acbb-eb764785f6d3\" (UID: \"0dfd8c66-3b51-4cf2-acbb-eb764785f6d3\") " Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.719753 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0dfd8c66-3b51-4cf2-acbb-eb764785f6d3-proxy-ca-bundles\") pod \"0dfd8c66-3b51-4cf2-acbb-eb764785f6d3\" (UID: \"0dfd8c66-3b51-4cf2-acbb-eb764785f6d3\") " Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.719798 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0dfd8c66-3b51-4cf2-acbb-eb764785f6d3-client-ca\") pod \"0dfd8c66-3b51-4cf2-acbb-eb764785f6d3\" (UID: \"0dfd8c66-3b51-4cf2-acbb-eb764785f6d3\") " Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.720315 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5a806156-ca3b-43dd-8b19-c072188004b7-client-ca\") pod \"controller-manager-678879cf8c-f74zz\" (UID: \"5a806156-ca3b-43dd-8b19-c072188004b7\") " pod="openshift-controller-manager/controller-manager-678879cf8c-f74zz" Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.720361 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a806156-ca3b-43dd-8b19-c072188004b7-serving-cert\") pod \"controller-manager-678879cf8c-f74zz\" (UID: 
\"5a806156-ca3b-43dd-8b19-c072188004b7\") " pod="openshift-controller-manager/controller-manager-678879cf8c-f74zz" Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.720394 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a806156-ca3b-43dd-8b19-c072188004b7-config\") pod \"controller-manager-678879cf8c-f74zz\" (UID: \"5a806156-ca3b-43dd-8b19-c072188004b7\") " pod="openshift-controller-manager/controller-manager-678879cf8c-f74zz" Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.720443 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsk6l\" (UniqueName: \"kubernetes.io/projected/5a806156-ca3b-43dd-8b19-c072188004b7-kube-api-access-rsk6l\") pod \"controller-manager-678879cf8c-f74zz\" (UID: \"5a806156-ca3b-43dd-8b19-c072188004b7\") " pod="openshift-controller-manager/controller-manager-678879cf8c-f74zz" Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.720525 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5a806156-ca3b-43dd-8b19-c072188004b7-proxy-ca-bundles\") pod \"controller-manager-678879cf8c-f74zz\" (UID: \"5a806156-ca3b-43dd-8b19-c072188004b7\") " pod="openshift-controller-manager/controller-manager-678879cf8c-f74zz" Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.721156 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0dfd8c66-3b51-4cf2-acbb-eb764785f6d3-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "0dfd8c66-3b51-4cf2-acbb-eb764785f6d3" (UID: "0dfd8c66-3b51-4cf2-acbb-eb764785f6d3"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.721163 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0dfd8c66-3b51-4cf2-acbb-eb764785f6d3-client-ca" (OuterVolumeSpecName: "client-ca") pod "0dfd8c66-3b51-4cf2-acbb-eb764785f6d3" (UID: "0dfd8c66-3b51-4cf2-acbb-eb764785f6d3"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.721218 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0dfd8c66-3b51-4cf2-acbb-eb764785f6d3-config" (OuterVolumeSpecName: "config") pod "0dfd8c66-3b51-4cf2-acbb-eb764785f6d3" (UID: "0dfd8c66-3b51-4cf2-acbb-eb764785f6d3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.726962 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dfd8c66-3b51-4cf2-acbb-eb764785f6d3-kube-api-access-fmctq" (OuterVolumeSpecName: "kube-api-access-fmctq") pod "0dfd8c66-3b51-4cf2-acbb-eb764785f6d3" (UID: "0dfd8c66-3b51-4cf2-acbb-eb764785f6d3"). InnerVolumeSpecName "kube-api-access-fmctq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.727784 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dfd8c66-3b51-4cf2-acbb-eb764785f6d3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0dfd8c66-3b51-4cf2-acbb-eb764785f6d3" (UID: "0dfd8c66-3b51-4cf2-acbb-eb764785f6d3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.821683 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5a806156-ca3b-43dd-8b19-c072188004b7-proxy-ca-bundles\") pod \"controller-manager-678879cf8c-f74zz\" (UID: \"5a806156-ca3b-43dd-8b19-c072188004b7\") " pod="openshift-controller-manager/controller-manager-678879cf8c-f74zz" Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.822133 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5a806156-ca3b-43dd-8b19-c072188004b7-client-ca\") pod \"controller-manager-678879cf8c-f74zz\" (UID: \"5a806156-ca3b-43dd-8b19-c072188004b7\") " pod="openshift-controller-manager/controller-manager-678879cf8c-f74zz" Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.822241 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a806156-ca3b-43dd-8b19-c072188004b7-serving-cert\") pod \"controller-manager-678879cf8c-f74zz\" (UID: \"5a806156-ca3b-43dd-8b19-c072188004b7\") " pod="openshift-controller-manager/controller-manager-678879cf8c-f74zz" Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.822321 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a806156-ca3b-43dd-8b19-c072188004b7-config\") pod \"controller-manager-678879cf8c-f74zz\" (UID: \"5a806156-ca3b-43dd-8b19-c072188004b7\") " pod="openshift-controller-manager/controller-manager-678879cf8c-f74zz" Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.822452 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsk6l\" (UniqueName: \"kubernetes.io/projected/5a806156-ca3b-43dd-8b19-c072188004b7-kube-api-access-rsk6l\") pod 
\"controller-manager-678879cf8c-f74zz\" (UID: \"5a806156-ca3b-43dd-8b19-c072188004b7\") " pod="openshift-controller-manager/controller-manager-678879cf8c-f74zz" Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.822642 4860 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0dfd8c66-3b51-4cf2-acbb-eb764785f6d3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.822750 4860 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0dfd8c66-3b51-4cf2-acbb-eb764785f6d3-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.822829 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmctq\" (UniqueName: \"kubernetes.io/projected/0dfd8c66-3b51-4cf2-acbb-eb764785f6d3-kube-api-access-fmctq\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.822896 4860 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0dfd8c66-3b51-4cf2-acbb-eb764785f6d3-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.823193 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5a806156-ca3b-43dd-8b19-c072188004b7-proxy-ca-bundles\") pod \"controller-manager-678879cf8c-f74zz\" (UID: \"5a806156-ca3b-43dd-8b19-c072188004b7\") " pod="openshift-controller-manager/controller-manager-678879cf8c-f74zz" Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.823715 4860 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0dfd8c66-3b51-4cf2-acbb-eb764785f6d3-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.824850 4860 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5a806156-ca3b-43dd-8b19-c072188004b7-client-ca\") pod \"controller-manager-678879cf8c-f74zz\" (UID: \"5a806156-ca3b-43dd-8b19-c072188004b7\") " pod="openshift-controller-manager/controller-manager-678879cf8c-f74zz" Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.826278 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a806156-ca3b-43dd-8b19-c072188004b7-serving-cert\") pod \"controller-manager-678879cf8c-f74zz\" (UID: \"5a806156-ca3b-43dd-8b19-c072188004b7\") " pod="openshift-controller-manager/controller-manager-678879cf8c-f74zz" Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.826730 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a806156-ca3b-43dd-8b19-c072188004b7-config\") pod \"controller-manager-678879cf8c-f74zz\" (UID: \"5a806156-ca3b-43dd-8b19-c072188004b7\") " pod="openshift-controller-manager/controller-manager-678879cf8c-f74zz" Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.846123 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsk6l\" (UniqueName: \"kubernetes.io/projected/5a806156-ca3b-43dd-8b19-c072188004b7-kube-api-access-rsk6l\") pod \"controller-manager-678879cf8c-f74zz\" (UID: \"5a806156-ca3b-43dd-8b19-c072188004b7\") " pod="openshift-controller-manager/controller-manager-678879cf8c-f74zz" Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.898938 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-678879cf8c-f74zz" Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.980982 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-589db55d97-b7n5d"] Mar 20 10:59:19 crc kubenswrapper[4860]: I0320 10:59:19.993161 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-589db55d97-b7n5d"] Mar 20 10:59:20 crc kubenswrapper[4860]: I0320 10:59:20.125048 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-678879cf8c-f74zz"] Mar 20 10:59:20 crc kubenswrapper[4860]: W0320 10:59:20.125964 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a806156_ca3b_43dd_8b19_c072188004b7.slice/crio-bc7889716935ff28fa767611a372177d38ff47e7dcc741593b2d67958a68c43b WatchSource:0}: Error finding container bc7889716935ff28fa767611a372177d38ff47e7dcc741593b2d67958a68c43b: Status 404 returned error can't find the container with id bc7889716935ff28fa767611a372177d38ff47e7dcc741593b2d67958a68c43b Mar 20 10:59:20 crc kubenswrapper[4860]: I0320 10:59:20.636529 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-678879cf8c-f74zz" event={"ID":"5a806156-ca3b-43dd-8b19-c072188004b7","Type":"ContainerStarted","Data":"8e915f332982c6439672c93348e89229311f91ce1bb0d43f012cc3a9b09a0bfa"} Mar 20 10:59:20 crc kubenswrapper[4860]: I0320 10:59:20.637010 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-678879cf8c-f74zz" event={"ID":"5a806156-ca3b-43dd-8b19-c072188004b7","Type":"ContainerStarted","Data":"bc7889716935ff28fa767611a372177d38ff47e7dcc741593b2d67958a68c43b"} Mar 20 10:59:20 crc kubenswrapper[4860]: I0320 10:59:20.664238 4860 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-controller-manager/controller-manager-678879cf8c-f74zz" podStartSLOduration=16.664193199 podStartE2EDuration="16.664193199s" podCreationTimestamp="2026-03-20 10:59:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:59:20.661239826 +0000 UTC m=+284.882600734" watchObservedRunningTime="2026-03-20 10:59:20.664193199 +0000 UTC m=+284.885554097" Mar 20 10:59:20 crc kubenswrapper[4860]: I0320 10:59:20.915990 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 10:59:21 crc kubenswrapper[4860]: I0320 10:59:21.056927 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/da7fc050-0408-49bf-a97d-3b5935573dc7-kube-api-access\") pod \"da7fc050-0408-49bf-a97d-3b5935573dc7\" (UID: \"da7fc050-0408-49bf-a97d-3b5935573dc7\") " Mar 20 10:59:21 crc kubenswrapper[4860]: I0320 10:59:21.058390 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/da7fc050-0408-49bf-a97d-3b5935573dc7-kubelet-dir\") pod \"da7fc050-0408-49bf-a97d-3b5935573dc7\" (UID: \"da7fc050-0408-49bf-a97d-3b5935573dc7\") " Mar 20 10:59:21 crc kubenswrapper[4860]: I0320 10:59:21.058863 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/da7fc050-0408-49bf-a97d-3b5935573dc7-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "da7fc050-0408-49bf-a97d-3b5935573dc7" (UID: "da7fc050-0408-49bf-a97d-3b5935573dc7"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 10:59:21 crc kubenswrapper[4860]: I0320 10:59:21.070849 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da7fc050-0408-49bf-a97d-3b5935573dc7-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "da7fc050-0408-49bf-a97d-3b5935573dc7" (UID: "da7fc050-0408-49bf-a97d-3b5935573dc7"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:59:21 crc kubenswrapper[4860]: I0320 10:59:21.159921 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/da7fc050-0408-49bf-a97d-3b5935573dc7-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:21 crc kubenswrapper[4860]: I0320 10:59:21.159971 4860 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/da7fc050-0408-49bf-a97d-3b5935573dc7-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:21 crc kubenswrapper[4860]: I0320 10:59:21.419993 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0dfd8c66-3b51-4cf2-acbb-eb764785f6d3" path="/var/lib/kubelet/pods/0dfd8c66-3b51-4cf2-acbb-eb764785f6d3/volumes" Mar 20 10:59:21 crc kubenswrapper[4860]: I0320 10:59:21.644469 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 10:59:21 crc kubenswrapper[4860]: I0320 10:59:21.644436 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"da7fc050-0408-49bf-a97d-3b5935573dc7","Type":"ContainerDied","Data":"19d920be2e0ceb91c887b842ac3e676890de81e2b61d14681a49db1194b2588e"} Mar 20 10:59:21 crc kubenswrapper[4860]: I0320 10:59:21.644530 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="19d920be2e0ceb91c887b842ac3e676890de81e2b61d14681a49db1194b2588e" Mar 20 10:59:21 crc kubenswrapper[4860]: I0320 10:59:21.646113 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-678879cf8c-f74zz" Mar 20 10:59:21 crc kubenswrapper[4860]: I0320 10:59:21.651263 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-678879cf8c-f74zz" Mar 20 10:59:22 crc kubenswrapper[4860]: I0320 10:59:22.344688 4860 patch_prober.go:28] interesting pod/machine-config-daemon-kvdqp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 10:59:22 crc kubenswrapper[4860]: I0320 10:59:22.345176 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 10:59:22 crc kubenswrapper[4860]: I0320 10:59:22.345260 4860 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" Mar 20 10:59:22 
crc kubenswrapper[4860]: I0320 10:59:22.345838 4860 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2176d4712fa4ee31d4322bb4cf9433058b2da0d6000e5bfb9e7d230fdffb3eda"} pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 10:59:22 crc kubenswrapper[4860]: I0320 10:59:22.345902 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" containerID="cri-o://2176d4712fa4ee31d4322bb4cf9433058b2da0d6000e5bfb9e7d230fdffb3eda" gracePeriod=600 Mar 20 10:59:22 crc kubenswrapper[4860]: I0320 10:59:22.388493 4860 patch_prober.go:28] interesting pod/downloads-7954f5f757-45vfv container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Mar 20 10:59:22 crc kubenswrapper[4860]: I0320 10:59:22.389191 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-45vfv" podUID="825c6b77-c03a-463c-b9a4-d26a1ac398f0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Mar 20 10:59:22 crc kubenswrapper[4860]: I0320 10:59:22.388512 4860 patch_prober.go:28] interesting pod/downloads-7954f5f757-45vfv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Mar 20 10:59:22 crc kubenswrapper[4860]: I0320 10:59:22.389423 4860 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-console/downloads-7954f5f757-45vfv" podUID="825c6b77-c03a-463c-b9a4-d26a1ac398f0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Mar 20 10:59:22 crc kubenswrapper[4860]: I0320 10:59:22.651254 4860 generic.go:334] "Generic (PLEG): container finished" podID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerID="2176d4712fa4ee31d4322bb4cf9433058b2da0d6000e5bfb9e7d230fdffb3eda" exitCode=0 Mar 20 10:59:22 crc kubenswrapper[4860]: I0320 10:59:22.652059 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" event={"ID":"6a9df230-75a1-4b64-8d00-c179e9c19080","Type":"ContainerDied","Data":"2176d4712fa4ee31d4322bb4cf9433058b2da0d6000e5bfb9e7d230fdffb3eda"} Mar 20 10:59:24 crc kubenswrapper[4860]: I0320 10:59:24.151474 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-678879cf8c-f74zz"] Mar 20 10:59:24 crc kubenswrapper[4860]: I0320 10:59:24.165463 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56668659b9-pdhht"] Mar 20 10:59:24 crc kubenswrapper[4860]: I0320 10:59:24.165705 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-56668659b9-pdhht" podUID="9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9" containerName="route-controller-manager" containerID="cri-o://16a49325d8d5b96f821cc45d49f4e0565898b38dd7a7655ded8401b745e9fbeb" gracePeriod=30 Mar 20 10:59:24 crc kubenswrapper[4860]: I0320 10:59:24.579174 4860 csr.go:261] certificate signing request csr-bl4br is approved, waiting to be issued Mar 20 10:59:24 crc kubenswrapper[4860]: I0320 10:59:24.586828 4860 patch_prober.go:28] interesting pod/route-controller-manager-56668659b9-pdhht container/route-controller-manager 
namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.58:8443/healthz\": dial tcp 10.217.0.58:8443: connect: connection refused" start-of-body= Mar 20 10:59:24 crc kubenswrapper[4860]: I0320 10:59:24.586906 4860 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-56668659b9-pdhht" podUID="9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.58:8443/healthz\": dial tcp 10.217.0.58:8443: connect: connection refused" Mar 20 10:59:24 crc kubenswrapper[4860]: I0320 10:59:24.590441 4860 csr.go:257] certificate signing request csr-bl4br is issued Mar 20 10:59:24 crc kubenswrapper[4860]: I0320 10:59:24.665890 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" event={"ID":"6a9df230-75a1-4b64-8d00-c179e9c19080","Type":"ContainerStarted","Data":"13fba41711384af20ab8200d2257307c76fbc844be8572bbe7995f53f6fb9ca6"} Mar 20 10:59:24 crc kubenswrapper[4860]: I0320 10:59:24.667743 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566738-5cj22" event={"ID":"ba2ab33e-6ecc-4eac-9aaa-256e6ff68236","Type":"ContainerStarted","Data":"133313b654a587091e098b1e8505700f3bdc77cfa2efebc3f2529891730788bc"} Mar 20 10:59:24 crc kubenswrapper[4860]: I0320 10:59:24.669293 4860 generic.go:334] "Generic (PLEG): container finished" podID="9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9" containerID="16a49325d8d5b96f821cc45d49f4e0565898b38dd7a7655ded8401b745e9fbeb" exitCode=0 Mar 20 10:59:24 crc kubenswrapper[4860]: I0320 10:59:24.669377 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-56668659b9-pdhht" 
event={"ID":"9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9","Type":"ContainerDied","Data":"16a49325d8d5b96f821cc45d49f4e0565898b38dd7a7655ded8401b745e9fbeb"} Mar 20 10:59:24 crc kubenswrapper[4860]: I0320 10:59:24.669479 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-678879cf8c-f74zz" podUID="5a806156-ca3b-43dd-8b19-c072188004b7" containerName="controller-manager" containerID="cri-o://8e915f332982c6439672c93348e89229311f91ce1bb0d43f012cc3a9b09a0bfa" gracePeriod=30 Mar 20 10:59:24 crc kubenswrapper[4860]: I0320 10:59:24.704589 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566738-5cj22" podStartSLOduration=26.574909835 podStartE2EDuration="1m24.704568508s" podCreationTimestamp="2026-03-20 10:58:00 +0000 UTC" firstStartedPulling="2026-03-20 10:58:25.246833699 +0000 UTC m=+229.468194597" lastFinishedPulling="2026-03-20 10:59:23.376492372 +0000 UTC m=+287.597853270" observedRunningTime="2026-03-20 10:59:24.701073708 +0000 UTC m=+288.922434606" watchObservedRunningTime="2026-03-20 10:59:24.704568508 +0000 UTC m=+288.925929406" Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.383257 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-56668659b9-pdhht" Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.430488 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6988ff758f-g2fv4"] Mar 20 10:59:25 crc kubenswrapper[4860]: E0320 10:59:25.430853 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da7fc050-0408-49bf-a97d-3b5935573dc7" containerName="pruner" Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.430877 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="da7fc050-0408-49bf-a97d-3b5935573dc7" containerName="pruner" Mar 20 10:59:25 crc kubenswrapper[4860]: E0320 10:59:25.430895 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9" containerName="route-controller-manager" Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.430906 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9" containerName="route-controller-manager" Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.431063 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="da7fc050-0408-49bf-a97d-3b5935573dc7" containerName="pruner" Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.431083 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9" containerName="route-controller-manager" Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.431738 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6988ff758f-g2fv4" Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.434803 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6988ff758f-g2fv4"] Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.438086 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59vs6\" (UniqueName: \"kubernetes.io/projected/9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9-kube-api-access-59vs6\") pod \"9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9\" (UID: \"9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9\") " Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.438285 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9-config\") pod \"9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9\" (UID: \"9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9\") " Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.438439 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9-serving-cert\") pod \"9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9\" (UID: \"9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9\") " Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.438562 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9-client-ca\") pod \"9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9\" (UID: \"9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9\") " Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.440883 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9-config" (OuterVolumeSpecName: "config") pod "9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9" (UID: 
"9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.441461 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9-client-ca" (OuterVolumeSpecName: "client-ca") pod "9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9" (UID: "9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.447051 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9" (UID: "9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.447195 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9-kube-api-access-59vs6" (OuterVolumeSpecName: "kube-api-access-59vs6") pod "9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9" (UID: "9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9"). InnerVolumeSpecName "kube-api-access-59vs6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.540808 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dd77d1cf-d25f-459c-95b4-96c63acd0462-client-ca\") pod \"route-controller-manager-6988ff758f-g2fv4\" (UID: \"dd77d1cf-d25f-459c-95b4-96c63acd0462\") " pod="openshift-route-controller-manager/route-controller-manager-6988ff758f-g2fv4" Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.541293 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mht7p\" (UniqueName: \"kubernetes.io/projected/dd77d1cf-d25f-459c-95b4-96c63acd0462-kube-api-access-mht7p\") pod \"route-controller-manager-6988ff758f-g2fv4\" (UID: \"dd77d1cf-d25f-459c-95b4-96c63acd0462\") " pod="openshift-route-controller-manager/route-controller-manager-6988ff758f-g2fv4" Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.541322 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dd77d1cf-d25f-459c-95b4-96c63acd0462-serving-cert\") pod \"route-controller-manager-6988ff758f-g2fv4\" (UID: \"dd77d1cf-d25f-459c-95b4-96c63acd0462\") " pod="openshift-route-controller-manager/route-controller-manager-6988ff758f-g2fv4" Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.541372 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd77d1cf-d25f-459c-95b4-96c63acd0462-config\") pod \"route-controller-manager-6988ff758f-g2fv4\" (UID: \"dd77d1cf-d25f-459c-95b4-96c63acd0462\") " pod="openshift-route-controller-manager/route-controller-manager-6988ff758f-g2fv4" Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.541418 4860 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-59vs6\" (UniqueName: \"kubernetes.io/projected/9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9-kube-api-access-59vs6\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.541429 4860 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.541438 4860 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.541450 4860 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.593268 4860 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-04 23:10:15.981593354 +0000 UTC Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.593304 4860 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6228h10m50.388291814s for next certificate rotation Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.642353 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mht7p\" (UniqueName: \"kubernetes.io/projected/dd77d1cf-d25f-459c-95b4-96c63acd0462-kube-api-access-mht7p\") pod \"route-controller-manager-6988ff758f-g2fv4\" (UID: \"dd77d1cf-d25f-459c-95b4-96c63acd0462\") " pod="openshift-route-controller-manager/route-controller-manager-6988ff758f-g2fv4" Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.642397 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/dd77d1cf-d25f-459c-95b4-96c63acd0462-serving-cert\") pod \"route-controller-manager-6988ff758f-g2fv4\" (UID: \"dd77d1cf-d25f-459c-95b4-96c63acd0462\") " pod="openshift-route-controller-manager/route-controller-manager-6988ff758f-g2fv4" Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.642449 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd77d1cf-d25f-459c-95b4-96c63acd0462-config\") pod \"route-controller-manager-6988ff758f-g2fv4\" (UID: \"dd77d1cf-d25f-459c-95b4-96c63acd0462\") " pod="openshift-route-controller-manager/route-controller-manager-6988ff758f-g2fv4" Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.642506 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dd77d1cf-d25f-459c-95b4-96c63acd0462-client-ca\") pod \"route-controller-manager-6988ff758f-g2fv4\" (UID: \"dd77d1cf-d25f-459c-95b4-96c63acd0462\") " pod="openshift-route-controller-manager/route-controller-manager-6988ff758f-g2fv4" Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.643819 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dd77d1cf-d25f-459c-95b4-96c63acd0462-client-ca\") pod \"route-controller-manager-6988ff758f-g2fv4\" (UID: \"dd77d1cf-d25f-459c-95b4-96c63acd0462\") " pod="openshift-route-controller-manager/route-controller-manager-6988ff758f-g2fv4" Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.648767 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dd77d1cf-d25f-459c-95b4-96c63acd0462-serving-cert\") pod \"route-controller-manager-6988ff758f-g2fv4\" (UID: \"dd77d1cf-d25f-459c-95b4-96c63acd0462\") " pod="openshift-route-controller-manager/route-controller-manager-6988ff758f-g2fv4" Mar 20 10:59:25 crc 
kubenswrapper[4860]: I0320 10:59:25.651543 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd77d1cf-d25f-459c-95b4-96c63acd0462-config\") pod \"route-controller-manager-6988ff758f-g2fv4\" (UID: \"dd77d1cf-d25f-459c-95b4-96c63acd0462\") " pod="openshift-route-controller-manager/route-controller-manager-6988ff758f-g2fv4" Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.661966 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mht7p\" (UniqueName: \"kubernetes.io/projected/dd77d1cf-d25f-459c-95b4-96c63acd0462-kube-api-access-mht7p\") pod \"route-controller-manager-6988ff758f-g2fv4\" (UID: \"dd77d1cf-d25f-459c-95b4-96c63acd0462\") " pod="openshift-route-controller-manager/route-controller-manager-6988ff758f-g2fv4" Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.681096 4860 generic.go:334] "Generic (PLEG): container finished" podID="5a806156-ca3b-43dd-8b19-c072188004b7" containerID="8e915f332982c6439672c93348e89229311f91ce1bb0d43f012cc3a9b09a0bfa" exitCode=0 Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.681185 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-678879cf8c-f74zz" event={"ID":"5a806156-ca3b-43dd-8b19-c072188004b7","Type":"ContainerDied","Data":"8e915f332982c6439672c93348e89229311f91ce1bb0d43f012cc3a9b09a0bfa"} Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.683265 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-56668659b9-pdhht" event={"ID":"9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9","Type":"ContainerDied","Data":"fd4fb2650db90abfc2fc04379e97c1d923867f1354b3859056544eb5327ee868"} Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.683325 4860 scope.go:117] "RemoveContainer" containerID="16a49325d8d5b96f821cc45d49f4e0565898b38dd7a7655ded8401b745e9fbeb" Mar 20 10:59:25 crc 
kubenswrapper[4860]: I0320 10:59:25.683549 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-56668659b9-pdhht" Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.690635 4860 generic.go:334] "Generic (PLEG): container finished" podID="ba2ab33e-6ecc-4eac-9aaa-256e6ff68236" containerID="133313b654a587091e098b1e8505700f3bdc77cfa2efebc3f2529891730788bc" exitCode=0 Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.690881 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566738-5cj22" event={"ID":"ba2ab33e-6ecc-4eac-9aaa-256e6ff68236","Type":"ContainerDied","Data":"133313b654a587091e098b1e8505700f3bdc77cfa2efebc3f2529891730788bc"} Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.750016 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6988ff758f-g2fv4" Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.757048 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56668659b9-pdhht"] Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.765109 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56668659b9-pdhht"] Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.867064 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-678879cf8c-f74zz" Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.955630 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5a806156-ca3b-43dd-8b19-c072188004b7-proxy-ca-bundles\") pod \"5a806156-ca3b-43dd-8b19-c072188004b7\" (UID: \"5a806156-ca3b-43dd-8b19-c072188004b7\") " Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.955999 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a806156-ca3b-43dd-8b19-c072188004b7-config\") pod \"5a806156-ca3b-43dd-8b19-c072188004b7\" (UID: \"5a806156-ca3b-43dd-8b19-c072188004b7\") " Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.956086 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a806156-ca3b-43dd-8b19-c072188004b7-serving-cert\") pod \"5a806156-ca3b-43dd-8b19-c072188004b7\" (UID: \"5a806156-ca3b-43dd-8b19-c072188004b7\") " Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.956140 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rsk6l\" (UniqueName: \"kubernetes.io/projected/5a806156-ca3b-43dd-8b19-c072188004b7-kube-api-access-rsk6l\") pod \"5a806156-ca3b-43dd-8b19-c072188004b7\" (UID: \"5a806156-ca3b-43dd-8b19-c072188004b7\") " Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.956166 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5a806156-ca3b-43dd-8b19-c072188004b7-client-ca\") pod \"5a806156-ca3b-43dd-8b19-c072188004b7\" (UID: \"5a806156-ca3b-43dd-8b19-c072188004b7\") " Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.957014 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/5a806156-ca3b-43dd-8b19-c072188004b7-client-ca" (OuterVolumeSpecName: "client-ca") pod "5a806156-ca3b-43dd-8b19-c072188004b7" (UID: "5a806156-ca3b-43dd-8b19-c072188004b7"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.957004 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a806156-ca3b-43dd-8b19-c072188004b7-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "5a806156-ca3b-43dd-8b19-c072188004b7" (UID: "5a806156-ca3b-43dd-8b19-c072188004b7"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.957047 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a806156-ca3b-43dd-8b19-c072188004b7-config" (OuterVolumeSpecName: "config") pod "5a806156-ca3b-43dd-8b19-c072188004b7" (UID: "5a806156-ca3b-43dd-8b19-c072188004b7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.963863 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a806156-ca3b-43dd-8b19-c072188004b7-kube-api-access-rsk6l" (OuterVolumeSpecName: "kube-api-access-rsk6l") pod "5a806156-ca3b-43dd-8b19-c072188004b7" (UID: "5a806156-ca3b-43dd-8b19-c072188004b7"). InnerVolumeSpecName "kube-api-access-rsk6l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:59:25 crc kubenswrapper[4860]: I0320 10:59:25.964309 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a806156-ca3b-43dd-8b19-c072188004b7-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5a806156-ca3b-43dd-8b19-c072188004b7" (UID: "5a806156-ca3b-43dd-8b19-c072188004b7"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:59:26 crc kubenswrapper[4860]: I0320 10:59:26.059248 4860 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5a806156-ca3b-43dd-8b19-c072188004b7-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:26 crc kubenswrapper[4860]: I0320 10:59:26.059529 4860 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a806156-ca3b-43dd-8b19-c072188004b7-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:26 crc kubenswrapper[4860]: I0320 10:59:26.062454 4860 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a806156-ca3b-43dd-8b19-c072188004b7-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:26 crc kubenswrapper[4860]: I0320 10:59:26.062477 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rsk6l\" (UniqueName: \"kubernetes.io/projected/5a806156-ca3b-43dd-8b19-c072188004b7-kube-api-access-rsk6l\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:26 crc kubenswrapper[4860]: I0320 10:59:26.062492 4860 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5a806156-ca3b-43dd-8b19-c072188004b7-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:26 crc kubenswrapper[4860]: I0320 10:59:26.354176 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6988ff758f-g2fv4"] Mar 20 10:59:26 crc kubenswrapper[4860]: W0320 10:59:26.365086 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd77d1cf_d25f_459c_95b4_96c63acd0462.slice/crio-e2890569cb8b864dbb24f8804e49612f8e5c8d8c8aceaa430ef20c9b0f36ef98 WatchSource:0}: Error finding container e2890569cb8b864dbb24f8804e49612f8e5c8d8c8aceaa430ef20c9b0f36ef98: 
Status 404 returned error can't find the container with id e2890569cb8b864dbb24f8804e49612f8e5c8d8c8aceaa430ef20c9b0f36ef98 Mar 20 10:59:26 crc kubenswrapper[4860]: I0320 10:59:26.603684 4860 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-08 04:00:06.258231721 +0000 UTC Mar 20 10:59:26 crc kubenswrapper[4860]: I0320 10:59:26.603739 4860 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7049h0m39.654496559s for next certificate rotation Mar 20 10:59:26 crc kubenswrapper[4860]: I0320 10:59:26.698127 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6988ff758f-g2fv4" event={"ID":"dd77d1cf-d25f-459c-95b4-96c63acd0462","Type":"ContainerStarted","Data":"7bce21f78b56a9d267f2935e0991aace1419bf6186a2be2b3f534d5ed5bf5798"} Mar 20 10:59:26 crc kubenswrapper[4860]: I0320 10:59:26.698186 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6988ff758f-g2fv4" event={"ID":"dd77d1cf-d25f-459c-95b4-96c63acd0462","Type":"ContainerStarted","Data":"e2890569cb8b864dbb24f8804e49612f8e5c8d8c8aceaa430ef20c9b0f36ef98"} Mar 20 10:59:26 crc kubenswrapper[4860]: I0320 10:59:26.704192 4860 generic.go:334] "Generic (PLEG): container finished" podID="f20cb95e-5480-4c9c-859f-0b03d679ab06" containerID="01d41166d3fd0e46f3b970c35d6bd76db150d430d4adb17e4e8f0fd1e47c7e63" exitCode=0 Mar 20 10:59:26 crc kubenswrapper[4860]: I0320 10:59:26.704294 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d9xlp" event={"ID":"f20cb95e-5480-4c9c-859f-0b03d679ab06","Type":"ContainerDied","Data":"01d41166d3fd0e46f3b970c35d6bd76db150d430d4adb17e4e8f0fd1e47c7e63"} Mar 20 10:59:26 crc kubenswrapper[4860]: I0320 10:59:26.706177 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-678879cf8c-f74zz" event={"ID":"5a806156-ca3b-43dd-8b19-c072188004b7","Type":"ContainerDied","Data":"bc7889716935ff28fa767611a372177d38ff47e7dcc741593b2d67958a68c43b"} Mar 20 10:59:26 crc kubenswrapper[4860]: I0320 10:59:26.706241 4860 scope.go:117] "RemoveContainer" containerID="8e915f332982c6439672c93348e89229311f91ce1bb0d43f012cc3a9b09a0bfa" Mar 20 10:59:26 crc kubenswrapper[4860]: I0320 10:59:26.706329 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-678879cf8c-f74zz" Mar 20 10:59:26 crc kubenswrapper[4860]: I0320 10:59:26.723034 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6988ff758f-g2fv4" podStartSLOduration=2.723005612 podStartE2EDuration="2.723005612s" podCreationTimestamp="2026-03-20 10:59:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:59:26.719465382 +0000 UTC m=+290.940826290" watchObservedRunningTime="2026-03-20 10:59:26.723005612 +0000 UTC m=+290.944366520" Mar 20 10:59:26 crc kubenswrapper[4860]: I0320 10:59:26.771200 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-678879cf8c-f74zz"] Mar 20 10:59:26 crc kubenswrapper[4860]: I0320 10:59:26.777417 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-678879cf8c-f74zz"] Mar 20 10:59:27 crc kubenswrapper[4860]: I0320 10:59:27.073821 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566738-5cj22" Mar 20 10:59:27 crc kubenswrapper[4860]: I0320 10:59:27.180418 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hjpq\" (UniqueName: \"kubernetes.io/projected/ba2ab33e-6ecc-4eac-9aaa-256e6ff68236-kube-api-access-5hjpq\") pod \"ba2ab33e-6ecc-4eac-9aaa-256e6ff68236\" (UID: \"ba2ab33e-6ecc-4eac-9aaa-256e6ff68236\") " Mar 20 10:59:27 crc kubenswrapper[4860]: I0320 10:59:27.189070 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba2ab33e-6ecc-4eac-9aaa-256e6ff68236-kube-api-access-5hjpq" (OuterVolumeSpecName: "kube-api-access-5hjpq") pod "ba2ab33e-6ecc-4eac-9aaa-256e6ff68236" (UID: "ba2ab33e-6ecc-4eac-9aaa-256e6ff68236"). InnerVolumeSpecName "kube-api-access-5hjpq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:59:27 crc kubenswrapper[4860]: I0320 10:59:27.283960 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hjpq\" (UniqueName: \"kubernetes.io/projected/ba2ab33e-6ecc-4eac-9aaa-256e6ff68236-kube-api-access-5hjpq\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:27 crc kubenswrapper[4860]: I0320 10:59:27.421129 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a806156-ca3b-43dd-8b19-c072188004b7" path="/var/lib/kubelet/pods/5a806156-ca3b-43dd-8b19-c072188004b7/volumes" Mar 20 10:59:27 crc kubenswrapper[4860]: I0320 10:59:27.423710 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9" path="/var/lib/kubelet/pods/9b0ce2f8-6bac-4fd7-81ad-2478d13e62c9/volumes" Mar 20 10:59:27 crc kubenswrapper[4860]: I0320 10:59:27.726857 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566738-5cj22" 
event={"ID":"ba2ab33e-6ecc-4eac-9aaa-256e6ff68236","Type":"ContainerDied","Data":"ea1a7118d7d4729065b9248b97584b35507102283df7254e36c0c2abc1c111d1"} Mar 20 10:59:27 crc kubenswrapper[4860]: I0320 10:59:27.726910 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea1a7118d7d4729065b9248b97584b35507102283df7254e36c0c2abc1c111d1" Mar 20 10:59:27 crc kubenswrapper[4860]: I0320 10:59:27.728291 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566738-5cj22" Mar 20 10:59:27 crc kubenswrapper[4860]: I0320 10:59:27.729979 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6988ff758f-g2fv4" Mar 20 10:59:27 crc kubenswrapper[4860]: I0320 10:59:27.748309 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6988ff758f-g2fv4" Mar 20 10:59:28 crc kubenswrapper[4860]: I0320 10:59:28.091565 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6fc74dd9cd-gwtmv"] Mar 20 10:59:28 crc kubenswrapper[4860]: E0320 10:59:28.092161 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba2ab33e-6ecc-4eac-9aaa-256e6ff68236" containerName="oc" Mar 20 10:59:28 crc kubenswrapper[4860]: I0320 10:59:28.092189 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba2ab33e-6ecc-4eac-9aaa-256e6ff68236" containerName="oc" Mar 20 10:59:28 crc kubenswrapper[4860]: E0320 10:59:28.092208 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a806156-ca3b-43dd-8b19-c072188004b7" containerName="controller-manager" Mar 20 10:59:28 crc kubenswrapper[4860]: I0320 10:59:28.092251 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a806156-ca3b-43dd-8b19-c072188004b7" containerName="controller-manager" Mar 20 10:59:28 crc 
kubenswrapper[4860]: I0320 10:59:28.092406 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba2ab33e-6ecc-4eac-9aaa-256e6ff68236" containerName="oc" Mar 20 10:59:28 crc kubenswrapper[4860]: I0320 10:59:28.092447 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a806156-ca3b-43dd-8b19-c072188004b7" containerName="controller-manager" Mar 20 10:59:28 crc kubenswrapper[4860]: I0320 10:59:28.093011 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6fc74dd9cd-gwtmv" Mar 20 10:59:28 crc kubenswrapper[4860]: I0320 10:59:28.103528 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 10:59:28 crc kubenswrapper[4860]: I0320 10:59:28.104342 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 10:59:28 crc kubenswrapper[4860]: I0320 10:59:28.105060 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6fc74dd9cd-gwtmv"] Mar 20 10:59:28 crc kubenswrapper[4860]: I0320 10:59:28.115914 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 10:59:28 crc kubenswrapper[4860]: I0320 10:59:28.117033 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 10:59:28 crc kubenswrapper[4860]: I0320 10:59:28.117708 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 10:59:28 crc kubenswrapper[4860]: I0320 10:59:28.121192 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 10:59:28 crc kubenswrapper[4860]: I0320 10:59:28.130567 4860 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 10:59:28 crc kubenswrapper[4860]: I0320 10:59:28.222873 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d61e9234-ffb5-44de-b42c-3c4a3028a994-client-ca\") pod \"controller-manager-6fc74dd9cd-gwtmv\" (UID: \"d61e9234-ffb5-44de-b42c-3c4a3028a994\") " pod="openshift-controller-manager/controller-manager-6fc74dd9cd-gwtmv" Mar 20 10:59:28 crc kubenswrapper[4860]: I0320 10:59:28.222948 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d61e9234-ffb5-44de-b42c-3c4a3028a994-config\") pod \"controller-manager-6fc74dd9cd-gwtmv\" (UID: \"d61e9234-ffb5-44de-b42c-3c4a3028a994\") " pod="openshift-controller-manager/controller-manager-6fc74dd9cd-gwtmv" Mar 20 10:59:28 crc kubenswrapper[4860]: I0320 10:59:28.222987 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d61e9234-ffb5-44de-b42c-3c4a3028a994-proxy-ca-bundles\") pod \"controller-manager-6fc74dd9cd-gwtmv\" (UID: \"d61e9234-ffb5-44de-b42c-3c4a3028a994\") " pod="openshift-controller-manager/controller-manager-6fc74dd9cd-gwtmv" Mar 20 10:59:28 crc kubenswrapper[4860]: I0320 10:59:28.223010 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmgqj\" (UniqueName: \"kubernetes.io/projected/d61e9234-ffb5-44de-b42c-3c4a3028a994-kube-api-access-tmgqj\") pod \"controller-manager-6fc74dd9cd-gwtmv\" (UID: \"d61e9234-ffb5-44de-b42c-3c4a3028a994\") " pod="openshift-controller-manager/controller-manager-6fc74dd9cd-gwtmv" Mar 20 10:59:28 crc kubenswrapper[4860]: I0320 10:59:28.223043 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/d61e9234-ffb5-44de-b42c-3c4a3028a994-serving-cert\") pod \"controller-manager-6fc74dd9cd-gwtmv\" (UID: \"d61e9234-ffb5-44de-b42c-3c4a3028a994\") " pod="openshift-controller-manager/controller-manager-6fc74dd9cd-gwtmv" Mar 20 10:59:28 crc kubenswrapper[4860]: I0320 10:59:28.324705 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d61e9234-ffb5-44de-b42c-3c4a3028a994-proxy-ca-bundles\") pod \"controller-manager-6fc74dd9cd-gwtmv\" (UID: \"d61e9234-ffb5-44de-b42c-3c4a3028a994\") " pod="openshift-controller-manager/controller-manager-6fc74dd9cd-gwtmv" Mar 20 10:59:28 crc kubenswrapper[4860]: I0320 10:59:28.324839 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmgqj\" (UniqueName: \"kubernetes.io/projected/d61e9234-ffb5-44de-b42c-3c4a3028a994-kube-api-access-tmgqj\") pod \"controller-manager-6fc74dd9cd-gwtmv\" (UID: \"d61e9234-ffb5-44de-b42c-3c4a3028a994\") " pod="openshift-controller-manager/controller-manager-6fc74dd9cd-gwtmv" Mar 20 10:59:28 crc kubenswrapper[4860]: I0320 10:59:28.324898 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d61e9234-ffb5-44de-b42c-3c4a3028a994-serving-cert\") pod \"controller-manager-6fc74dd9cd-gwtmv\" (UID: \"d61e9234-ffb5-44de-b42c-3c4a3028a994\") " pod="openshift-controller-manager/controller-manager-6fc74dd9cd-gwtmv" Mar 20 10:59:28 crc kubenswrapper[4860]: I0320 10:59:28.325002 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d61e9234-ffb5-44de-b42c-3c4a3028a994-client-ca\") pod \"controller-manager-6fc74dd9cd-gwtmv\" (UID: \"d61e9234-ffb5-44de-b42c-3c4a3028a994\") " pod="openshift-controller-manager/controller-manager-6fc74dd9cd-gwtmv" Mar 20 10:59:28 crc 
kubenswrapper[4860]: I0320 10:59:28.325048 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d61e9234-ffb5-44de-b42c-3c4a3028a994-config\") pod \"controller-manager-6fc74dd9cd-gwtmv\" (UID: \"d61e9234-ffb5-44de-b42c-3c4a3028a994\") " pod="openshift-controller-manager/controller-manager-6fc74dd9cd-gwtmv" Mar 20 10:59:28 crc kubenswrapper[4860]: I0320 10:59:28.326038 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d61e9234-ffb5-44de-b42c-3c4a3028a994-client-ca\") pod \"controller-manager-6fc74dd9cd-gwtmv\" (UID: \"d61e9234-ffb5-44de-b42c-3c4a3028a994\") " pod="openshift-controller-manager/controller-manager-6fc74dd9cd-gwtmv" Mar 20 10:59:28 crc kubenswrapper[4860]: I0320 10:59:28.326555 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d61e9234-ffb5-44de-b42c-3c4a3028a994-proxy-ca-bundles\") pod \"controller-manager-6fc74dd9cd-gwtmv\" (UID: \"d61e9234-ffb5-44de-b42c-3c4a3028a994\") " pod="openshift-controller-manager/controller-manager-6fc74dd9cd-gwtmv" Mar 20 10:59:28 crc kubenswrapper[4860]: I0320 10:59:28.327178 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d61e9234-ffb5-44de-b42c-3c4a3028a994-config\") pod \"controller-manager-6fc74dd9cd-gwtmv\" (UID: \"d61e9234-ffb5-44de-b42c-3c4a3028a994\") " pod="openshift-controller-manager/controller-manager-6fc74dd9cd-gwtmv" Mar 20 10:59:28 crc kubenswrapper[4860]: I0320 10:59:28.342672 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d61e9234-ffb5-44de-b42c-3c4a3028a994-serving-cert\") pod \"controller-manager-6fc74dd9cd-gwtmv\" (UID: \"d61e9234-ffb5-44de-b42c-3c4a3028a994\") " 
pod="openshift-controller-manager/controller-manager-6fc74dd9cd-gwtmv" Mar 20 10:59:28 crc kubenswrapper[4860]: I0320 10:59:28.346017 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmgqj\" (UniqueName: \"kubernetes.io/projected/d61e9234-ffb5-44de-b42c-3c4a3028a994-kube-api-access-tmgqj\") pod \"controller-manager-6fc74dd9cd-gwtmv\" (UID: \"d61e9234-ffb5-44de-b42c-3c4a3028a994\") " pod="openshift-controller-manager/controller-manager-6fc74dd9cd-gwtmv" Mar 20 10:59:28 crc kubenswrapper[4860]: I0320 10:59:28.429296 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6fc74dd9cd-gwtmv" Mar 20 10:59:29 crc kubenswrapper[4860]: I0320 10:59:29.744916 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d9xlp" event={"ID":"f20cb95e-5480-4c9c-859f-0b03d679ab06","Type":"ContainerStarted","Data":"3d9773a58e39a2f11588636465b237f3ac7e55d6ecd57abbb192e60cfa1df11f"} Mar 20 10:59:29 crc kubenswrapper[4860]: I0320 10:59:29.749802 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w5w95" event={"ID":"4f84f111-5991-4e78-9508-82283b8e36f7","Type":"ContainerStarted","Data":"1e4d0d88e053aac4fbcbc20ef32b8805ad5fa659b78c3257d2c273ebeb4ec532"} Mar 20 10:59:29 crc kubenswrapper[4860]: I0320 10:59:29.774984 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6fc74dd9cd-gwtmv"] Mar 20 10:59:29 crc kubenswrapper[4860]: I0320 10:59:29.781781 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-d9xlp" podStartSLOduration=2.455059678 podStartE2EDuration="57.781757975s" podCreationTimestamp="2026-03-20 10:58:32 +0000 UTC" firstStartedPulling="2026-03-20 10:58:33.978484559 +0000 UTC m=+238.199845457" lastFinishedPulling="2026-03-20 10:59:29.305182856 +0000 UTC 
m=+293.526543754" observedRunningTime="2026-03-20 10:59:29.769894929 +0000 UTC m=+293.991255857" watchObservedRunningTime="2026-03-20 10:59:29.781757975 +0000 UTC m=+294.003118873" Mar 20 10:59:30 crc kubenswrapper[4860]: I0320 10:59:30.757585 4860 generic.go:334] "Generic (PLEG): container finished" podID="4f84f111-5991-4e78-9508-82283b8e36f7" containerID="1e4d0d88e053aac4fbcbc20ef32b8805ad5fa659b78c3257d2c273ebeb4ec532" exitCode=0 Mar 20 10:59:30 crc kubenswrapper[4860]: I0320 10:59:30.757683 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w5w95" event={"ID":"4f84f111-5991-4e78-9508-82283b8e36f7","Type":"ContainerDied","Data":"1e4d0d88e053aac4fbcbc20ef32b8805ad5fa659b78c3257d2c273ebeb4ec532"} Mar 20 10:59:30 crc kubenswrapper[4860]: I0320 10:59:30.765924 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6fc74dd9cd-gwtmv" event={"ID":"d61e9234-ffb5-44de-b42c-3c4a3028a994","Type":"ContainerStarted","Data":"411ec5427fcb58cf525e6877c9bd13ab7217beeed0854afb3fa90c2410f789ec"} Mar 20 10:59:30 crc kubenswrapper[4860]: I0320 10:59:30.765967 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6fc74dd9cd-gwtmv" event={"ID":"d61e9234-ffb5-44de-b42c-3c4a3028a994","Type":"ContainerStarted","Data":"b8dcffcb468a289a993f7232cdab0940ede87b61eba385b847779a075838b8d1"} Mar 20 10:59:30 crc kubenswrapper[4860]: I0320 10:59:30.766203 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6fc74dd9cd-gwtmv" Mar 20 10:59:30 crc kubenswrapper[4860]: I0320 10:59:30.781535 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6fc74dd9cd-gwtmv" Mar 20 10:59:30 crc kubenswrapper[4860]: I0320 10:59:30.802954 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-controller-manager/controller-manager-6fc74dd9cd-gwtmv" podStartSLOduration=6.80292667 podStartE2EDuration="6.80292667s" podCreationTimestamp="2026-03-20 10:59:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:59:30.801761117 +0000 UTC m=+295.023122015" watchObservedRunningTime="2026-03-20 10:59:30.80292667 +0000 UTC m=+295.024287568" Mar 20 10:59:31 crc kubenswrapper[4860]: I0320 10:59:31.236522 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-srz5x"] Mar 20 10:59:32 crc kubenswrapper[4860]: I0320 10:59:32.399894 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-45vfv" Mar 20 10:59:32 crc kubenswrapper[4860]: I0320 10:59:32.462207 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-d9xlp" Mar 20 10:59:32 crc kubenswrapper[4860]: I0320 10:59:32.462418 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-d9xlp" Mar 20 10:59:32 crc kubenswrapper[4860]: I0320 10:59:32.785682 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n79b7" event={"ID":"d2690d8b-c7f7-4e71-af44-33444e4d6187","Type":"ContainerStarted","Data":"dca022e41de9a1a813fac9c51ffa02dd710183b0522c00eebac503fc8a8a606a"} Mar 20 10:59:32 crc kubenswrapper[4860]: I0320 10:59:32.795061 4860 generic.go:334] "Generic (PLEG): container finished" podID="7b622f82-e01c-42b8-8061-16b6e8f551fb" containerID="c56ad28f23266e55b726d6def152f7daf70f2346cb9ad3e38d40fd7ed925aeca" exitCode=0 Mar 20 10:59:32 crc kubenswrapper[4860]: I0320 10:59:32.795148 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p6sh9" 
event={"ID":"7b622f82-e01c-42b8-8061-16b6e8f551fb","Type":"ContainerDied","Data":"c56ad28f23266e55b726d6def152f7daf70f2346cb9ad3e38d40fd7ed925aeca"} Mar 20 10:59:32 crc kubenswrapper[4860]: I0320 10:59:32.801621 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5jpww" event={"ID":"2268b7ae-c1db-4ef4-8236-60f7cfa277a1","Type":"ContainerStarted","Data":"a570fccad49b61cba0e967c8a578dc38074b1cc636e9a24780ad0104b28b074c"} Mar 20 10:59:32 crc kubenswrapper[4860]: I0320 10:59:32.806571 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jx27x" event={"ID":"f81a43aa-2c39-4d49-8526-f097322dd7bf","Type":"ContainerStarted","Data":"5000df7098db21086cd500235d0dfe1cd2c6c277e01022d3498595c652a31046"} Mar 20 10:59:32 crc kubenswrapper[4860]: I0320 10:59:32.811542 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r7ckk" event={"ID":"f0e14a08-824b-450f-bf98-2a476da0d44b","Type":"ContainerStarted","Data":"a5700f11fea1e3d3f4586b36f718ac04fc6f4838eaf0dec6f0868235a01313a6"} Mar 20 10:59:33 crc kubenswrapper[4860]: I0320 10:59:33.778901 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-d9xlp" podUID="f20cb95e-5480-4c9c-859f-0b03d679ab06" containerName="registry-server" probeResult="failure" output=< Mar 20 10:59:33 crc kubenswrapper[4860]: timeout: failed to connect service ":50051" within 1s Mar 20 10:59:33 crc kubenswrapper[4860]: > Mar 20 10:59:33 crc kubenswrapper[4860]: I0320 10:59:33.820331 4860 generic.go:334] "Generic (PLEG): container finished" podID="f81a43aa-2c39-4d49-8526-f097322dd7bf" containerID="5000df7098db21086cd500235d0dfe1cd2c6c277e01022d3498595c652a31046" exitCode=0 Mar 20 10:59:33 crc kubenswrapper[4860]: I0320 10:59:33.820413 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jx27x" 
event={"ID":"f81a43aa-2c39-4d49-8526-f097322dd7bf","Type":"ContainerDied","Data":"5000df7098db21086cd500235d0dfe1cd2c6c277e01022d3498595c652a31046"} Mar 20 10:59:33 crc kubenswrapper[4860]: I0320 10:59:33.823686 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w5w95" event={"ID":"4f84f111-5991-4e78-9508-82283b8e36f7","Type":"ContainerStarted","Data":"bd0572960d7229285fe10a8cb3d58bbc9ca87bc5ab990b09db80034d14aa31ff"} Mar 20 10:59:33 crc kubenswrapper[4860]: I0320 10:59:33.826767 4860 generic.go:334] "Generic (PLEG): container finished" podID="f0e14a08-824b-450f-bf98-2a476da0d44b" containerID="a5700f11fea1e3d3f4586b36f718ac04fc6f4838eaf0dec6f0868235a01313a6" exitCode=0 Mar 20 10:59:33 crc kubenswrapper[4860]: I0320 10:59:33.826867 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r7ckk" event={"ID":"f0e14a08-824b-450f-bf98-2a476da0d44b","Type":"ContainerDied","Data":"a5700f11fea1e3d3f4586b36f718ac04fc6f4838eaf0dec6f0868235a01313a6"} Mar 20 10:59:33 crc kubenswrapper[4860]: I0320 10:59:33.837825 4860 generic.go:334] "Generic (PLEG): container finished" podID="d2690d8b-c7f7-4e71-af44-33444e4d6187" containerID="dca022e41de9a1a813fac9c51ffa02dd710183b0522c00eebac503fc8a8a606a" exitCode=0 Mar 20 10:59:33 crc kubenswrapper[4860]: I0320 10:59:33.837963 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n79b7" event={"ID":"d2690d8b-c7f7-4e71-af44-33444e4d6187","Type":"ContainerDied","Data":"dca022e41de9a1a813fac9c51ffa02dd710183b0522c00eebac503fc8a8a606a"} Mar 20 10:59:33 crc kubenswrapper[4860]: I0320 10:59:33.855129 4860 generic.go:334] "Generic (PLEG): container finished" podID="2268b7ae-c1db-4ef4-8236-60f7cfa277a1" containerID="a570fccad49b61cba0e967c8a578dc38074b1cc636e9a24780ad0104b28b074c" exitCode=0 Mar 20 10:59:33 crc kubenswrapper[4860]: I0320 10:59:33.855197 4860 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-5jpww" event={"ID":"2268b7ae-c1db-4ef4-8236-60f7cfa277a1","Type":"ContainerDied","Data":"a570fccad49b61cba0e967c8a578dc38074b1cc636e9a24780ad0104b28b074c"} Mar 20 10:59:33 crc kubenswrapper[4860]: I0320 10:59:33.870301 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-w5w95" podStartSLOduration=4.472739806 podStartE2EDuration="1m3.870261727s" podCreationTimestamp="2026-03-20 10:58:30 +0000 UTC" firstStartedPulling="2026-03-20 10:58:32.964598797 +0000 UTC m=+237.185959695" lastFinishedPulling="2026-03-20 10:59:32.362120718 +0000 UTC m=+296.583481616" observedRunningTime="2026-03-20 10:59:33.865497472 +0000 UTC m=+298.086858390" watchObservedRunningTime="2026-03-20 10:59:33.870261727 +0000 UTC m=+298.091622635" Mar 20 10:59:35 crc kubenswrapper[4860]: I0320 10:59:35.869795 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p6sh9" event={"ID":"7b622f82-e01c-42b8-8061-16b6e8f551fb","Type":"ContainerStarted","Data":"7dbc6b1d3db25220d5d86cbd542a694d0703ce989da64d73e6670a17039b2234"} Mar 20 10:59:35 crc kubenswrapper[4860]: I0320 10:59:35.894560 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-p6sh9" podStartSLOduration=3.780842911 podStartE2EDuration="1m5.894518046s" podCreationTimestamp="2026-03-20 10:58:30 +0000 UTC" firstStartedPulling="2026-03-20 10:58:32.783419718 +0000 UTC m=+237.004780616" lastFinishedPulling="2026-03-20 10:59:34.897094853 +0000 UTC m=+299.118455751" observedRunningTime="2026-03-20 10:59:35.889381351 +0000 UTC m=+300.110742279" watchObservedRunningTime="2026-03-20 10:59:35.894518046 +0000 UTC m=+300.115878944" Mar 20 10:59:40 crc kubenswrapper[4860]: I0320 10:59:40.984090 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-w5w95" Mar 20 
10:59:40 crc kubenswrapper[4860]: I0320 10:59:40.984883 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-w5w95" Mar 20 10:59:41 crc kubenswrapper[4860]: I0320 10:59:41.063056 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-w5w95" Mar 20 10:59:41 crc kubenswrapper[4860]: I0320 10:59:41.181146 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-p6sh9" Mar 20 10:59:41 crc kubenswrapper[4860]: I0320 10:59:41.181216 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-p6sh9" Mar 20 10:59:41 crc kubenswrapper[4860]: I0320 10:59:41.227999 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-p6sh9" Mar 20 10:59:41 crc kubenswrapper[4860]: I0320 10:59:41.913514 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5jpww" event={"ID":"2268b7ae-c1db-4ef4-8236-60f7cfa277a1","Type":"ContainerStarted","Data":"e9d5be622305f211c3e12e12dfe924fbcb59aee9d65c271786e393a4e06cec5a"} Mar 20 10:59:41 crc kubenswrapper[4860]: I0320 10:59:41.916405 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qq8bh" event={"ID":"514f05c3-1404-46c6-9f4d-68437ea8ee0b","Type":"ContainerStarted","Data":"844c418dac774548dbf403e3c458e1c031e1eab485a8882313d1a36213143b22"} Mar 20 10:59:41 crc kubenswrapper[4860]: I0320 10:59:41.965415 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-w5w95" Mar 20 10:59:41 crc kubenswrapper[4860]: I0320 10:59:41.980967 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-p6sh9" Mar 20 10:59:42 crc 
kubenswrapper[4860]: I0320 10:59:42.524032 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-d9xlp" Mar 20 10:59:42 crc kubenswrapper[4860]: I0320 10:59:42.574082 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-d9xlp" Mar 20 10:59:42 crc kubenswrapper[4860]: I0320 10:59:42.649699 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p6sh9"] Mar 20 10:59:42 crc kubenswrapper[4860]: I0320 10:59:42.926503 4860 generic.go:334] "Generic (PLEG): container finished" podID="514f05c3-1404-46c6-9f4d-68437ea8ee0b" containerID="844c418dac774548dbf403e3c458e1c031e1eab485a8882313d1a36213143b22" exitCode=0 Mar 20 10:59:42 crc kubenswrapper[4860]: I0320 10:59:42.927073 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qq8bh" event={"ID":"514f05c3-1404-46c6-9f4d-68437ea8ee0b","Type":"ContainerDied","Data":"844c418dac774548dbf403e3c458e1c031e1eab485a8882313d1a36213143b22"} Mar 20 10:59:42 crc kubenswrapper[4860]: I0320 10:59:42.970837 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5jpww" podStartSLOduration=4.484042726 podStartE2EDuration="1m10.97080895s" podCreationTimestamp="2026-03-20 10:58:32 +0000 UTC" firstStartedPulling="2026-03-20 10:58:33.988841558 +0000 UTC m=+238.210202456" lastFinishedPulling="2026-03-20 10:59:40.475607782 +0000 UTC m=+304.696968680" observedRunningTime="2026-03-20 10:59:42.968131074 +0000 UTC m=+307.189491972" watchObservedRunningTime="2026-03-20 10:59:42.97080895 +0000 UTC m=+307.192169858" Mar 20 10:59:43 crc kubenswrapper[4860]: I0320 10:59:43.932989 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-p6sh9" podUID="7b622f82-e01c-42b8-8061-16b6e8f551fb" 
containerName="registry-server" containerID="cri-o://7dbc6b1d3db25220d5d86cbd542a694d0703ce989da64d73e6670a17039b2234" gracePeriod=2 Mar 20 10:59:44 crc kubenswrapper[4860]: I0320 10:59:44.133692 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6fc74dd9cd-gwtmv"] Mar 20 10:59:44 crc kubenswrapper[4860]: I0320 10:59:44.134024 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6fc74dd9cd-gwtmv" podUID="d61e9234-ffb5-44de-b42c-3c4a3028a994" containerName="controller-manager" containerID="cri-o://411ec5427fcb58cf525e6877c9bd13ab7217beeed0854afb3fa90c2410f789ec" gracePeriod=30 Mar 20 10:59:44 crc kubenswrapper[4860]: I0320 10:59:44.234044 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6988ff758f-g2fv4"] Mar 20 10:59:44 crc kubenswrapper[4860]: I0320 10:59:44.234332 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6988ff758f-g2fv4" podUID="dd77d1cf-d25f-459c-95b4-96c63acd0462" containerName="route-controller-manager" containerID="cri-o://7bce21f78b56a9d267f2935e0991aace1419bf6186a2be2b3f534d5ed5bf5798" gracePeriod=30 Mar 20 10:59:44 crc kubenswrapper[4860]: I0320 10:59:44.941932 4860 generic.go:334] "Generic (PLEG): container finished" podID="d61e9234-ffb5-44de-b42c-3c4a3028a994" containerID="411ec5427fcb58cf525e6877c9bd13ab7217beeed0854afb3fa90c2410f789ec" exitCode=0 Mar 20 10:59:44 crc kubenswrapper[4860]: I0320 10:59:44.942166 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6fc74dd9cd-gwtmv" event={"ID":"d61e9234-ffb5-44de-b42c-3c4a3028a994","Type":"ContainerDied","Data":"411ec5427fcb58cf525e6877c9bd13ab7217beeed0854afb3fa90c2410f789ec"} Mar 20 10:59:45 crc kubenswrapper[4860]: I0320 10:59:45.751476 4860 
patch_prober.go:28] interesting pod/route-controller-manager-6988ff758f-g2fv4 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.63:8443/healthz\": dial tcp 10.217.0.63:8443: connect: connection refused" start-of-body= Mar 20 10:59:45 crc kubenswrapper[4860]: I0320 10:59:45.751571 4860 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6988ff758f-g2fv4" podUID="dd77d1cf-d25f-459c-95b4-96c63acd0462" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.63:8443/healthz\": dial tcp 10.217.0.63:8443: connect: connection refused" Mar 20 10:59:45 crc kubenswrapper[4860]: I0320 10:59:45.969411 4860 generic.go:334] "Generic (PLEG): container finished" podID="7b622f82-e01c-42b8-8061-16b6e8f551fb" containerID="7dbc6b1d3db25220d5d86cbd542a694d0703ce989da64d73e6670a17039b2234" exitCode=0 Mar 20 10:59:45 crc kubenswrapper[4860]: I0320 10:59:45.969516 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p6sh9" event={"ID":"7b622f82-e01c-42b8-8061-16b6e8f551fb","Type":"ContainerDied","Data":"7dbc6b1d3db25220d5d86cbd542a694d0703ce989da64d73e6670a17039b2234"} Mar 20 10:59:45 crc kubenswrapper[4860]: I0320 10:59:45.971989 4860 generic.go:334] "Generic (PLEG): container finished" podID="dd77d1cf-d25f-459c-95b4-96c63acd0462" containerID="7bce21f78b56a9d267f2935e0991aace1419bf6186a2be2b3f534d5ed5bf5798" exitCode=0 Mar 20 10:59:45 crc kubenswrapper[4860]: I0320 10:59:45.972036 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6988ff758f-g2fv4" event={"ID":"dd77d1cf-d25f-459c-95b4-96c63acd0462","Type":"ContainerDied","Data":"7bce21f78b56a9d267f2935e0991aace1419bf6186a2be2b3f534d5ed5bf5798"} Mar 20 10:59:46 crc kubenswrapper[4860]: I0320 10:59:46.929576 4860 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6fc74dd9cd-gwtmv" Mar 20 10:59:46 crc kubenswrapper[4860]: I0320 10:59:46.964430 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-76ccc9cdd4-wwr8m"] Mar 20 10:59:46 crc kubenswrapper[4860]: E0320 10:59:46.964738 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d61e9234-ffb5-44de-b42c-3c4a3028a994" containerName="controller-manager" Mar 20 10:59:46 crc kubenswrapper[4860]: I0320 10:59:46.964756 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="d61e9234-ffb5-44de-b42c-3c4a3028a994" containerName="controller-manager" Mar 20 10:59:46 crc kubenswrapper[4860]: I0320 10:59:46.964904 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="d61e9234-ffb5-44de-b42c-3c4a3028a994" containerName="controller-manager" Mar 20 10:59:46 crc kubenswrapper[4860]: I0320 10:59:46.965426 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-76ccc9cdd4-wwr8m" Mar 20 10:59:46 crc kubenswrapper[4860]: I0320 10:59:46.984793 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-76ccc9cdd4-wwr8m"] Mar 20 10:59:46 crc kubenswrapper[4860]: I0320 10:59:46.990477 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6fc74dd9cd-gwtmv" event={"ID":"d61e9234-ffb5-44de-b42c-3c4a3028a994","Type":"ContainerDied","Data":"b8dcffcb468a289a993f7232cdab0940ede87b61eba385b847779a075838b8d1"} Mar 20 10:59:46 crc kubenswrapper[4860]: I0320 10:59:46.990573 4860 scope.go:117] "RemoveContainer" containerID="411ec5427fcb58cf525e6877c9bd13ab7217beeed0854afb3fa90c2410f789ec" Mar 20 10:59:46 crc kubenswrapper[4860]: I0320 10:59:46.990744 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6fc74dd9cd-gwtmv" Mar 20 10:59:46 crc kubenswrapper[4860]: I0320 10:59:46.995159 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n79b7" event={"ID":"d2690d8b-c7f7-4e71-af44-33444e4d6187","Type":"ContainerStarted","Data":"8e14e348104a685ae72ae8d4fc93750bfae41d97762c8a26ba93a04b1dc8a3d4"} Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.041084 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d61e9234-ffb5-44de-b42c-3c4a3028a994-config\") pod \"d61e9234-ffb5-44de-b42c-3c4a3028a994\" (UID: \"d61e9234-ffb5-44de-b42c-3c4a3028a994\") " Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.041209 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d61e9234-ffb5-44de-b42c-3c4a3028a994-client-ca\") pod \"d61e9234-ffb5-44de-b42c-3c4a3028a994\" (UID: \"d61e9234-ffb5-44de-b42c-3c4a3028a994\") " Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.041347 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmgqj\" (UniqueName: \"kubernetes.io/projected/d61e9234-ffb5-44de-b42c-3c4a3028a994-kube-api-access-tmgqj\") pod \"d61e9234-ffb5-44de-b42c-3c4a3028a994\" (UID: \"d61e9234-ffb5-44de-b42c-3c4a3028a994\") " Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.041439 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d61e9234-ffb5-44de-b42c-3c4a3028a994-serving-cert\") pod \"d61e9234-ffb5-44de-b42c-3c4a3028a994\" (UID: \"d61e9234-ffb5-44de-b42c-3c4a3028a994\") " Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.041473 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/d61e9234-ffb5-44de-b42c-3c4a3028a994-proxy-ca-bundles\") pod \"d61e9234-ffb5-44de-b42c-3c4a3028a994\" (UID: \"d61e9234-ffb5-44de-b42c-3c4a3028a994\") " Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.041719 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a1c3ede-cf8a-4fc1-9a11-642b5546723c-serving-cert\") pod \"controller-manager-76ccc9cdd4-wwr8m\" (UID: \"3a1c3ede-cf8a-4fc1-9a11-642b5546723c\") " pod="openshift-controller-manager/controller-manager-76ccc9cdd4-wwr8m" Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.041768 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3a1c3ede-cf8a-4fc1-9a11-642b5546723c-proxy-ca-bundles\") pod \"controller-manager-76ccc9cdd4-wwr8m\" (UID: \"3a1c3ede-cf8a-4fc1-9a11-642b5546723c\") " pod="openshift-controller-manager/controller-manager-76ccc9cdd4-wwr8m" Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.041800 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a1c3ede-cf8a-4fc1-9a11-642b5546723c-config\") pod \"controller-manager-76ccc9cdd4-wwr8m\" (UID: \"3a1c3ede-cf8a-4fc1-9a11-642b5546723c\") " pod="openshift-controller-manager/controller-manager-76ccc9cdd4-wwr8m" Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.041851 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74fmb\" (UniqueName: \"kubernetes.io/projected/3a1c3ede-cf8a-4fc1-9a11-642b5546723c-kube-api-access-74fmb\") pod \"controller-manager-76ccc9cdd4-wwr8m\" (UID: \"3a1c3ede-cf8a-4fc1-9a11-642b5546723c\") " pod="openshift-controller-manager/controller-manager-76ccc9cdd4-wwr8m" Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.041924 
4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3a1c3ede-cf8a-4fc1-9a11-642b5546723c-client-ca\") pod \"controller-manager-76ccc9cdd4-wwr8m\" (UID: \"3a1c3ede-cf8a-4fc1-9a11-642b5546723c\") " pod="openshift-controller-manager/controller-manager-76ccc9cdd4-wwr8m" Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.042209 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d61e9234-ffb5-44de-b42c-3c4a3028a994-config" (OuterVolumeSpecName: "config") pod "d61e9234-ffb5-44de-b42c-3c4a3028a994" (UID: "d61e9234-ffb5-44de-b42c-3c4a3028a994"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.043434 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d61e9234-ffb5-44de-b42c-3c4a3028a994-client-ca" (OuterVolumeSpecName: "client-ca") pod "d61e9234-ffb5-44de-b42c-3c4a3028a994" (UID: "d61e9234-ffb5-44de-b42c-3c4a3028a994"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.044463 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d61e9234-ffb5-44de-b42c-3c4a3028a994-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "d61e9234-ffb5-44de-b42c-3c4a3028a994" (UID: "d61e9234-ffb5-44de-b42c-3c4a3028a994"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.050717 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d61e9234-ffb5-44de-b42c-3c4a3028a994-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d61e9234-ffb5-44de-b42c-3c4a3028a994" (UID: "d61e9234-ffb5-44de-b42c-3c4a3028a994"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.053481 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d61e9234-ffb5-44de-b42c-3c4a3028a994-kube-api-access-tmgqj" (OuterVolumeSpecName: "kube-api-access-tmgqj") pod "d61e9234-ffb5-44de-b42c-3c4a3028a994" (UID: "d61e9234-ffb5-44de-b42c-3c4a3028a994"). InnerVolumeSpecName "kube-api-access-tmgqj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.143598 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74fmb\" (UniqueName: \"kubernetes.io/projected/3a1c3ede-cf8a-4fc1-9a11-642b5546723c-kube-api-access-74fmb\") pod \"controller-manager-76ccc9cdd4-wwr8m\" (UID: \"3a1c3ede-cf8a-4fc1-9a11-642b5546723c\") " pod="openshift-controller-manager/controller-manager-76ccc9cdd4-wwr8m" Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.148080 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3a1c3ede-cf8a-4fc1-9a11-642b5546723c-client-ca\") pod \"controller-manager-76ccc9cdd4-wwr8m\" (UID: \"3a1c3ede-cf8a-4fc1-9a11-642b5546723c\") " pod="openshift-controller-manager/controller-manager-76ccc9cdd4-wwr8m" Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.148259 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/3a1c3ede-cf8a-4fc1-9a11-642b5546723c-serving-cert\") pod \"controller-manager-76ccc9cdd4-wwr8m\" (UID: \"3a1c3ede-cf8a-4fc1-9a11-642b5546723c\") " pod="openshift-controller-manager/controller-manager-76ccc9cdd4-wwr8m" Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.148329 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3a1c3ede-cf8a-4fc1-9a11-642b5546723c-proxy-ca-bundles\") pod \"controller-manager-76ccc9cdd4-wwr8m\" (UID: \"3a1c3ede-cf8a-4fc1-9a11-642b5546723c\") " pod="openshift-controller-manager/controller-manager-76ccc9cdd4-wwr8m" Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.150446 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a1c3ede-cf8a-4fc1-9a11-642b5546723c-config\") pod \"controller-manager-76ccc9cdd4-wwr8m\" (UID: \"3a1c3ede-cf8a-4fc1-9a11-642b5546723c\") " pod="openshift-controller-manager/controller-manager-76ccc9cdd4-wwr8m" Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.150060 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3a1c3ede-cf8a-4fc1-9a11-642b5546723c-proxy-ca-bundles\") pod \"controller-manager-76ccc9cdd4-wwr8m\" (UID: \"3a1c3ede-cf8a-4fc1-9a11-642b5546723c\") " pod="openshift-controller-manager/controller-manager-76ccc9cdd4-wwr8m" Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.150650 4860 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d61e9234-ffb5-44de-b42c-3c4a3028a994-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.149429 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3a1c3ede-cf8a-4fc1-9a11-642b5546723c-client-ca\") pod 
\"controller-manager-76ccc9cdd4-wwr8m\" (UID: \"3a1c3ede-cf8a-4fc1-9a11-642b5546723c\") " pod="openshift-controller-manager/controller-manager-76ccc9cdd4-wwr8m" Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.150667 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmgqj\" (UniqueName: \"kubernetes.io/projected/d61e9234-ffb5-44de-b42c-3c4a3028a994-kube-api-access-tmgqj\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.150679 4860 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d61e9234-ffb5-44de-b42c-3c4a3028a994-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.150689 4860 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d61e9234-ffb5-44de-b42c-3c4a3028a994-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.150704 4860 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d61e9234-ffb5-44de-b42c-3c4a3028a994-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.151953 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a1c3ede-cf8a-4fc1-9a11-642b5546723c-config\") pod \"controller-manager-76ccc9cdd4-wwr8m\" (UID: \"3a1c3ede-cf8a-4fc1-9a11-642b5546723c\") " pod="openshift-controller-manager/controller-manager-76ccc9cdd4-wwr8m" Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.165853 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a1c3ede-cf8a-4fc1-9a11-642b5546723c-serving-cert\") pod \"controller-manager-76ccc9cdd4-wwr8m\" (UID: \"3a1c3ede-cf8a-4fc1-9a11-642b5546723c\") " 
pod="openshift-controller-manager/controller-manager-76ccc9cdd4-wwr8m" Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.170285 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74fmb\" (UniqueName: \"kubernetes.io/projected/3a1c3ede-cf8a-4fc1-9a11-642b5546723c-kube-api-access-74fmb\") pod \"controller-manager-76ccc9cdd4-wwr8m\" (UID: \"3a1c3ede-cf8a-4fc1-9a11-642b5546723c\") " pod="openshift-controller-manager/controller-manager-76ccc9cdd4-wwr8m" Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.234862 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p6sh9" Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.261383 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-n79b7" podStartSLOduration=5.804290566 podStartE2EDuration="1m17.261358148s" podCreationTimestamp="2026-03-20 10:58:30 +0000 UTC" firstStartedPulling="2026-03-20 10:58:32.765814558 +0000 UTC m=+236.987175456" lastFinishedPulling="2026-03-20 10:59:44.22288215 +0000 UTC m=+308.444243038" observedRunningTime="2026-03-20 10:59:47.023338361 +0000 UTC m=+311.244699259" watchObservedRunningTime="2026-03-20 10:59:47.261358148 +0000 UTC m=+311.482719036" Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.281760 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6988ff758f-g2fv4" Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.285755 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-76ccc9cdd4-wwr8m" Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.326003 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6fc74dd9cd-gwtmv"] Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.328648 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6fc74dd9cd-gwtmv"] Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.352950 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b622f82-e01c-42b8-8061-16b6e8f551fb-utilities\") pod \"7b622f82-e01c-42b8-8061-16b6e8f551fb\" (UID: \"7b622f82-e01c-42b8-8061-16b6e8f551fb\") " Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.353112 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-77rcr\" (UniqueName: \"kubernetes.io/projected/7b622f82-e01c-42b8-8061-16b6e8f551fb-kube-api-access-77rcr\") pod \"7b622f82-e01c-42b8-8061-16b6e8f551fb\" (UID: \"7b622f82-e01c-42b8-8061-16b6e8f551fb\") " Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.353183 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b622f82-e01c-42b8-8061-16b6e8f551fb-catalog-content\") pod \"7b622f82-e01c-42b8-8061-16b6e8f551fb\" (UID: \"7b622f82-e01c-42b8-8061-16b6e8f551fb\") " Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.353830 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b622f82-e01c-42b8-8061-16b6e8f551fb-utilities" (OuterVolumeSpecName: "utilities") pod "7b622f82-e01c-42b8-8061-16b6e8f551fb" (UID: "7b622f82-e01c-42b8-8061-16b6e8f551fb"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.356657 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b622f82-e01c-42b8-8061-16b6e8f551fb-kube-api-access-77rcr" (OuterVolumeSpecName: "kube-api-access-77rcr") pod "7b622f82-e01c-42b8-8061-16b6e8f551fb" (UID: "7b622f82-e01c-42b8-8061-16b6e8f551fb"). InnerVolumeSpecName "kube-api-access-77rcr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.408832 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b622f82-e01c-42b8-8061-16b6e8f551fb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7b622f82-e01c-42b8-8061-16b6e8f551fb" (UID: "7b622f82-e01c-42b8-8061-16b6e8f551fb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.422397 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d61e9234-ffb5-44de-b42c-3c4a3028a994" path="/var/lib/kubelet/pods/d61e9234-ffb5-44de-b42c-3c4a3028a994/volumes" Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.454452 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mht7p\" (UniqueName: \"kubernetes.io/projected/dd77d1cf-d25f-459c-95b4-96c63acd0462-kube-api-access-mht7p\") pod \"dd77d1cf-d25f-459c-95b4-96c63acd0462\" (UID: \"dd77d1cf-d25f-459c-95b4-96c63acd0462\") " Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.454525 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd77d1cf-d25f-459c-95b4-96c63acd0462-config\") pod \"dd77d1cf-d25f-459c-95b4-96c63acd0462\" (UID: \"dd77d1cf-d25f-459c-95b4-96c63acd0462\") " Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.454565 4860 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dd77d1cf-d25f-459c-95b4-96c63acd0462-client-ca\") pod \"dd77d1cf-d25f-459c-95b4-96c63acd0462\" (UID: \"dd77d1cf-d25f-459c-95b4-96c63acd0462\") " Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.454630 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dd77d1cf-d25f-459c-95b4-96c63acd0462-serving-cert\") pod \"dd77d1cf-d25f-459c-95b4-96c63acd0462\" (UID: \"dd77d1cf-d25f-459c-95b4-96c63acd0462\") " Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.454994 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-77rcr\" (UniqueName: \"kubernetes.io/projected/7b622f82-e01c-42b8-8061-16b6e8f551fb-kube-api-access-77rcr\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.455019 4860 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b622f82-e01c-42b8-8061-16b6e8f551fb-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.455029 4860 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b622f82-e01c-42b8-8061-16b6e8f551fb-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.457494 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd77d1cf-d25f-459c-95b4-96c63acd0462-config" (OuterVolumeSpecName: "config") pod "dd77d1cf-d25f-459c-95b4-96c63acd0462" (UID: "dd77d1cf-d25f-459c-95b4-96c63acd0462"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.457918 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd77d1cf-d25f-459c-95b4-96c63acd0462-client-ca" (OuterVolumeSpecName: "client-ca") pod "dd77d1cf-d25f-459c-95b4-96c63acd0462" (UID: "dd77d1cf-d25f-459c-95b4-96c63acd0462"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.459614 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd77d1cf-d25f-459c-95b4-96c63acd0462-kube-api-access-mht7p" (OuterVolumeSpecName: "kube-api-access-mht7p") pod "dd77d1cf-d25f-459c-95b4-96c63acd0462" (UID: "dd77d1cf-d25f-459c-95b4-96c63acd0462"). InnerVolumeSpecName "kube-api-access-mht7p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.462134 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd77d1cf-d25f-459c-95b4-96c63acd0462-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "dd77d1cf-d25f-459c-95b4-96c63acd0462" (UID: "dd77d1cf-d25f-459c-95b4-96c63acd0462"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.581872 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mht7p\" (UniqueName: \"kubernetes.io/projected/dd77d1cf-d25f-459c-95b4-96c63acd0462-kube-api-access-mht7p\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.581920 4860 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd77d1cf-d25f-459c-95b4-96c63acd0462-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.581934 4860 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dd77d1cf-d25f-459c-95b4-96c63acd0462-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:47 crc kubenswrapper[4860]: I0320 10:59:47.581949 4860 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dd77d1cf-d25f-459c-95b4-96c63acd0462-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:48 crc kubenswrapper[4860]: I0320 10:59:48.004439 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p6sh9" event={"ID":"7b622f82-e01c-42b8-8061-16b6e8f551fb","Type":"ContainerDied","Data":"a0b846fa7e38edf968a2ecba37cb073cb9550be5be8e0d139e448911ef2ef8dd"} Mar 20 10:59:48 crc kubenswrapper[4860]: I0320 10:59:48.004495 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-p6sh9" Mar 20 10:59:48 crc kubenswrapper[4860]: I0320 10:59:48.004536 4860 scope.go:117] "RemoveContainer" containerID="7dbc6b1d3db25220d5d86cbd542a694d0703ce989da64d73e6670a17039b2234" Mar 20 10:59:48 crc kubenswrapper[4860]: I0320 10:59:48.010646 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6988ff758f-g2fv4" event={"ID":"dd77d1cf-d25f-459c-95b4-96c63acd0462","Type":"ContainerDied","Data":"e2890569cb8b864dbb24f8804e49612f8e5c8d8c8aceaa430ef20c9b0f36ef98"} Mar 20 10:59:48 crc kubenswrapper[4860]: I0320 10:59:48.010783 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6988ff758f-g2fv4" Mar 20 10:59:48 crc kubenswrapper[4860]: I0320 10:59:48.033356 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p6sh9"] Mar 20 10:59:48 crc kubenswrapper[4860]: I0320 10:59:48.040625 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-p6sh9"] Mar 20 10:59:48 crc kubenswrapper[4860]: I0320 10:59:48.047622 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6988ff758f-g2fv4"] Mar 20 10:59:48 crc kubenswrapper[4860]: I0320 10:59:48.050509 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6988ff758f-g2fv4"] Mar 20 10:59:48 crc kubenswrapper[4860]: I0320 10:59:48.390127 4860 scope.go:117] "RemoveContainer" containerID="c56ad28f23266e55b726d6def152f7daf70f2346cb9ad3e38d40fd7ed925aeca" Mar 20 10:59:48 crc kubenswrapper[4860]: I0320 10:59:48.446195 4860 scope.go:117] "RemoveContainer" containerID="e24cde6154df13c72246e903afddf246ae7bde629e68f46946db4ded716f4fbf" Mar 20 10:59:48 crc kubenswrapper[4860]: I0320 
10:59:48.479623 4860 scope.go:117] "RemoveContainer" containerID="7bce21f78b56a9d267f2935e0991aace1419bf6186a2be2b3f534d5ed5bf5798" Mar 20 10:59:48 crc kubenswrapper[4860]: I0320 10:59:48.661190 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-76ccc9cdd4-wwr8m"] Mar 20 10:59:48 crc kubenswrapper[4860]: W0320 10:59:48.677540 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a1c3ede_cf8a_4fc1_9a11_642b5546723c.slice/crio-c75901e95ec374b4b949a3afd9193502e556912cf621619a18f896cd80ddbb54 WatchSource:0}: Error finding container c75901e95ec374b4b949a3afd9193502e556912cf621619a18f896cd80ddbb54: Status 404 returned error can't find the container with id c75901e95ec374b4b949a3afd9193502e556912cf621619a18f896cd80ddbb54 Mar 20 10:59:49 crc kubenswrapper[4860]: I0320 10:59:49.021670 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-76ccc9cdd4-wwr8m" event={"ID":"3a1c3ede-cf8a-4fc1-9a11-642b5546723c","Type":"ContainerStarted","Data":"c75901e95ec374b4b949a3afd9193502e556912cf621619a18f896cd80ddbb54"} Mar 20 10:59:49 crc kubenswrapper[4860]: I0320 10:59:49.104707 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6656bbf56-bpd7h"] Mar 20 10:59:49 crc kubenswrapper[4860]: E0320 10:59:49.104944 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b622f82-e01c-42b8-8061-16b6e8f551fb" containerName="extract-utilities" Mar 20 10:59:49 crc kubenswrapper[4860]: I0320 10:59:49.104958 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b622f82-e01c-42b8-8061-16b6e8f551fb" containerName="extract-utilities" Mar 20 10:59:49 crc kubenswrapper[4860]: E0320 10:59:49.104971 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd77d1cf-d25f-459c-95b4-96c63acd0462" 
containerName="route-controller-manager" Mar 20 10:59:49 crc kubenswrapper[4860]: I0320 10:59:49.104977 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd77d1cf-d25f-459c-95b4-96c63acd0462" containerName="route-controller-manager" Mar 20 10:59:49 crc kubenswrapper[4860]: E0320 10:59:49.104986 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b622f82-e01c-42b8-8061-16b6e8f551fb" containerName="registry-server" Mar 20 10:59:49 crc kubenswrapper[4860]: I0320 10:59:49.104992 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b622f82-e01c-42b8-8061-16b6e8f551fb" containerName="registry-server" Mar 20 10:59:49 crc kubenswrapper[4860]: E0320 10:59:49.105009 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b622f82-e01c-42b8-8061-16b6e8f551fb" containerName="extract-content" Mar 20 10:59:49 crc kubenswrapper[4860]: I0320 10:59:49.105014 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b622f82-e01c-42b8-8061-16b6e8f551fb" containerName="extract-content" Mar 20 10:59:49 crc kubenswrapper[4860]: I0320 10:59:49.105116 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd77d1cf-d25f-459c-95b4-96c63acd0462" containerName="route-controller-manager" Mar 20 10:59:49 crc kubenswrapper[4860]: I0320 10:59:49.105129 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b622f82-e01c-42b8-8061-16b6e8f551fb" containerName="registry-server" Mar 20 10:59:49 crc kubenswrapper[4860]: I0320 10:59:49.105599 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6656bbf56-bpd7h" Mar 20 10:59:49 crc kubenswrapper[4860]: I0320 10:59:49.108416 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 10:59:49 crc kubenswrapper[4860]: I0320 10:59:49.108703 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 10:59:49 crc kubenswrapper[4860]: I0320 10:59:49.108965 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 10:59:49 crc kubenswrapper[4860]: I0320 10:59:49.109116 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 10:59:49 crc kubenswrapper[4860]: I0320 10:59:49.109332 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 10:59:49 crc kubenswrapper[4860]: I0320 10:59:49.109355 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 10:59:49 crc kubenswrapper[4860]: I0320 10:59:49.113509 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6656bbf56-bpd7h"] Mar 20 10:59:49 crc kubenswrapper[4860]: I0320 10:59:49.213125 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2zw2\" (UniqueName: \"kubernetes.io/projected/a7b77bd7-65cb-402d-a918-3b3e8457b656-kube-api-access-v2zw2\") pod \"route-controller-manager-6656bbf56-bpd7h\" (UID: \"a7b77bd7-65cb-402d-a918-3b3e8457b656\") " pod="openshift-route-controller-manager/route-controller-manager-6656bbf56-bpd7h" Mar 20 10:59:49 crc kubenswrapper[4860]: I0320 10:59:49.213630 4860 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a7b77bd7-65cb-402d-a918-3b3e8457b656-client-ca\") pod \"route-controller-manager-6656bbf56-bpd7h\" (UID: \"a7b77bd7-65cb-402d-a918-3b3e8457b656\") " pod="openshift-route-controller-manager/route-controller-manager-6656bbf56-bpd7h" Mar 20 10:59:49 crc kubenswrapper[4860]: I0320 10:59:49.213657 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a7b77bd7-65cb-402d-a918-3b3e8457b656-serving-cert\") pod \"route-controller-manager-6656bbf56-bpd7h\" (UID: \"a7b77bd7-65cb-402d-a918-3b3e8457b656\") " pod="openshift-route-controller-manager/route-controller-manager-6656bbf56-bpd7h" Mar 20 10:59:49 crc kubenswrapper[4860]: I0320 10:59:49.213679 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7b77bd7-65cb-402d-a918-3b3e8457b656-config\") pod \"route-controller-manager-6656bbf56-bpd7h\" (UID: \"a7b77bd7-65cb-402d-a918-3b3e8457b656\") " pod="openshift-route-controller-manager/route-controller-manager-6656bbf56-bpd7h" Mar 20 10:59:49 crc kubenswrapper[4860]: I0320 10:59:49.314556 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2zw2\" (UniqueName: \"kubernetes.io/projected/a7b77bd7-65cb-402d-a918-3b3e8457b656-kube-api-access-v2zw2\") pod \"route-controller-manager-6656bbf56-bpd7h\" (UID: \"a7b77bd7-65cb-402d-a918-3b3e8457b656\") " pod="openshift-route-controller-manager/route-controller-manager-6656bbf56-bpd7h" Mar 20 10:59:49 crc kubenswrapper[4860]: I0320 10:59:49.314609 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a7b77bd7-65cb-402d-a918-3b3e8457b656-client-ca\") pod 
\"route-controller-manager-6656bbf56-bpd7h\" (UID: \"a7b77bd7-65cb-402d-a918-3b3e8457b656\") " pod="openshift-route-controller-manager/route-controller-manager-6656bbf56-bpd7h" Mar 20 10:59:49 crc kubenswrapper[4860]: I0320 10:59:49.314633 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a7b77bd7-65cb-402d-a918-3b3e8457b656-serving-cert\") pod \"route-controller-manager-6656bbf56-bpd7h\" (UID: \"a7b77bd7-65cb-402d-a918-3b3e8457b656\") " pod="openshift-route-controller-manager/route-controller-manager-6656bbf56-bpd7h" Mar 20 10:59:49 crc kubenswrapper[4860]: I0320 10:59:49.314659 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7b77bd7-65cb-402d-a918-3b3e8457b656-config\") pod \"route-controller-manager-6656bbf56-bpd7h\" (UID: \"a7b77bd7-65cb-402d-a918-3b3e8457b656\") " pod="openshift-route-controller-manager/route-controller-manager-6656bbf56-bpd7h" Mar 20 10:59:49 crc kubenswrapper[4860]: I0320 10:59:49.315819 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a7b77bd7-65cb-402d-a918-3b3e8457b656-client-ca\") pod \"route-controller-manager-6656bbf56-bpd7h\" (UID: \"a7b77bd7-65cb-402d-a918-3b3e8457b656\") " pod="openshift-route-controller-manager/route-controller-manager-6656bbf56-bpd7h" Mar 20 10:59:49 crc kubenswrapper[4860]: I0320 10:59:49.315927 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7b77bd7-65cb-402d-a918-3b3e8457b656-config\") pod \"route-controller-manager-6656bbf56-bpd7h\" (UID: \"a7b77bd7-65cb-402d-a918-3b3e8457b656\") " pod="openshift-route-controller-manager/route-controller-manager-6656bbf56-bpd7h" Mar 20 10:59:49 crc kubenswrapper[4860]: I0320 10:59:49.325169 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/a7b77bd7-65cb-402d-a918-3b3e8457b656-serving-cert\") pod \"route-controller-manager-6656bbf56-bpd7h\" (UID: \"a7b77bd7-65cb-402d-a918-3b3e8457b656\") " pod="openshift-route-controller-manager/route-controller-manager-6656bbf56-bpd7h" Mar 20 10:59:49 crc kubenswrapper[4860]: I0320 10:59:49.337085 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2zw2\" (UniqueName: \"kubernetes.io/projected/a7b77bd7-65cb-402d-a918-3b3e8457b656-kube-api-access-v2zw2\") pod \"route-controller-manager-6656bbf56-bpd7h\" (UID: \"a7b77bd7-65cb-402d-a918-3b3e8457b656\") " pod="openshift-route-controller-manager/route-controller-manager-6656bbf56-bpd7h" Mar 20 10:59:49 crc kubenswrapper[4860]: I0320 10:59:49.420908 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b622f82-e01c-42b8-8061-16b6e8f551fb" path="/var/lib/kubelet/pods/7b622f82-e01c-42b8-8061-16b6e8f551fb/volumes" Mar 20 10:59:49 crc kubenswrapper[4860]: I0320 10:59:49.421858 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd77d1cf-d25f-459c-95b4-96c63acd0462" path="/var/lib/kubelet/pods/dd77d1cf-d25f-459c-95b4-96c63acd0462/volumes" Mar 20 10:59:49 crc kubenswrapper[4860]: I0320 10:59:49.424763 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6656bbf56-bpd7h" Mar 20 10:59:49 crc kubenswrapper[4860]: I0320 10:59:49.694011 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6656bbf56-bpd7h"] Mar 20 10:59:50 crc kubenswrapper[4860]: I0320 10:59:50.030015 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6656bbf56-bpd7h" event={"ID":"a7b77bd7-65cb-402d-a918-3b3e8457b656","Type":"ContainerStarted","Data":"ef27fae7a5852a1262f2e378fa238cc0332680623e5aec34999ac8b5d2ec62a0"} Mar 20 10:59:50 crc kubenswrapper[4860]: I0320 10:59:50.030542 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6656bbf56-bpd7h" event={"ID":"a7b77bd7-65cb-402d-a918-3b3e8457b656","Type":"ContainerStarted","Data":"9ea70187f1142ba967548126a5cae21547c6b783bca186f712cd7eea24603a59"} Mar 20 10:59:50 crc kubenswrapper[4860]: I0320 10:59:50.030575 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6656bbf56-bpd7h" Mar 20 10:59:50 crc kubenswrapper[4860]: I0320 10:59:50.035548 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qq8bh" event={"ID":"514f05c3-1404-46c6-9f4d-68437ea8ee0b","Type":"ContainerStarted","Data":"dd62fe82165d292d4799e06ebb35b908db080de2fe5e228a5d81891d222fa958"} Mar 20 10:59:50 crc kubenswrapper[4860]: I0320 10:59:50.039776 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jx27x" event={"ID":"f81a43aa-2c39-4d49-8526-f097322dd7bf","Type":"ContainerStarted","Data":"19e5b54cfca0bde6cf7f8393230c7ed69a1c9b30a1db1b83c9b87021655f765e"} Mar 20 10:59:50 crc kubenswrapper[4860]: I0320 10:59:50.043601 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-r7ckk" event={"ID":"f0e14a08-824b-450f-bf98-2a476da0d44b","Type":"ContainerStarted","Data":"1aa3a53909a5c843b03ac48e95c43de617bf5c9f3bb63dc85380fa7fe7677bb4"} Mar 20 10:59:50 crc kubenswrapper[4860]: I0320 10:59:50.045617 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-76ccc9cdd4-wwr8m" event={"ID":"3a1c3ede-cf8a-4fc1-9a11-642b5546723c","Type":"ContainerStarted","Data":"697843f7d5d53f9ae0d486b963ad3259397c8e7cb3a46cd16fd19db2821f91d0"} Mar 20 10:59:50 crc kubenswrapper[4860]: I0320 10:59:50.046375 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-76ccc9cdd4-wwr8m" Mar 20 10:59:50 crc kubenswrapper[4860]: I0320 10:59:50.053099 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-76ccc9cdd4-wwr8m" Mar 20 10:59:50 crc kubenswrapper[4860]: I0320 10:59:50.055574 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6656bbf56-bpd7h" podStartSLOduration=6.055550101 podStartE2EDuration="6.055550101s" podCreationTimestamp="2026-03-20 10:59:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:59:50.054066289 +0000 UTC m=+314.275427197" watchObservedRunningTime="2026-03-20 10:59:50.055550101 +0000 UTC m=+314.276910999" Mar 20 10:59:50 crc kubenswrapper[4860]: I0320 10:59:50.082907 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qq8bh" podStartSLOduration=3.765153021 podStartE2EDuration="1m17.082880146s" podCreationTimestamp="2026-03-20 10:58:33 +0000 UTC" firstStartedPulling="2026-03-20 10:58:35.072532412 +0000 UTC m=+239.293893310" lastFinishedPulling="2026-03-20 
10:59:48.390259537 +0000 UTC m=+312.611620435" observedRunningTime="2026-03-20 10:59:50.076009451 +0000 UTC m=+314.297370369" watchObservedRunningTime="2026-03-20 10:59:50.082880146 +0000 UTC m=+314.304241054" Mar 20 10:59:50 crc kubenswrapper[4860]: I0320 10:59:50.096600 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-76ccc9cdd4-wwr8m" podStartSLOduration=6.096575424 podStartE2EDuration="6.096575424s" podCreationTimestamp="2026-03-20 10:59:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:59:50.094598918 +0000 UTC m=+314.315959836" watchObservedRunningTime="2026-03-20 10:59:50.096575424 +0000 UTC m=+314.317936322" Mar 20 10:59:50 crc kubenswrapper[4860]: I0320 10:59:50.117340 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-r7ckk" podStartSLOduration=5.899908142 podStartE2EDuration="1m20.117309631s" podCreationTimestamp="2026-03-20 10:58:30 +0000 UTC" firstStartedPulling="2026-03-20 10:58:32.998454569 +0000 UTC m=+237.219815467" lastFinishedPulling="2026-03-20 10:59:47.215856058 +0000 UTC m=+311.437216956" observedRunningTime="2026-03-20 10:59:50.11301918 +0000 UTC m=+314.334380078" watchObservedRunningTime="2026-03-20 10:59:50.117309631 +0000 UTC m=+314.338670529" Mar 20 10:59:50 crc kubenswrapper[4860]: I0320 10:59:50.545351 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6656bbf56-bpd7h" Mar 20 10:59:50 crc kubenswrapper[4860]: I0320 10:59:50.570710 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jx27x" podStartSLOduration=4.569617128 podStartE2EDuration="1m17.570684273s" podCreationTimestamp="2026-03-20 10:58:33 +0000 UTC" firstStartedPulling="2026-03-20 
10:58:35.082815838 +0000 UTC m=+239.304176736" lastFinishedPulling="2026-03-20 10:59:48.083882983 +0000 UTC m=+312.305243881" observedRunningTime="2026-03-20 10:59:50.141210669 +0000 UTC m=+314.362571567" watchObservedRunningTime="2026-03-20 10:59:50.570684273 +0000 UTC m=+314.792045171" Mar 20 10:59:50 crc kubenswrapper[4860]: I0320 10:59:50.764425 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-n79b7" Mar 20 10:59:50 crc kubenswrapper[4860]: I0320 10:59:50.764489 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-n79b7" Mar 20 10:59:50 crc kubenswrapper[4860]: I0320 10:59:50.822708 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-n79b7" Mar 20 10:59:51 crc kubenswrapper[4860]: I0320 10:59:51.093732 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-n79b7" Mar 20 10:59:51 crc kubenswrapper[4860]: I0320 10:59:51.167432 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-r7ckk" Mar 20 10:59:51 crc kubenswrapper[4860]: I0320 10:59:51.167484 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-r7ckk" Mar 20 10:59:52 crc kubenswrapper[4860]: I0320 10:59:52.213491 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-r7ckk" podUID="f0e14a08-824b-450f-bf98-2a476da0d44b" containerName="registry-server" probeResult="failure" output=< Mar 20 10:59:52 crc kubenswrapper[4860]: timeout: failed to connect service ":50051" within 1s Mar 20 10:59:52 crc kubenswrapper[4860]: > Mar 20 10:59:52 crc kubenswrapper[4860]: I0320 10:59:52.863585 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-marketplace-5jpww" Mar 20 10:59:52 crc kubenswrapper[4860]: I0320 10:59:52.864051 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5jpww" Mar 20 10:59:52 crc kubenswrapper[4860]: I0320 10:59:52.902086 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5jpww" Mar 20 10:59:53 crc kubenswrapper[4860]: I0320 10:59:53.096288 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5jpww" Mar 20 10:59:53 crc kubenswrapper[4860]: I0320 10:59:53.660984 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qq8bh" Mar 20 10:59:53 crc kubenswrapper[4860]: I0320 10:59:53.662277 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qq8bh" Mar 20 10:59:54 crc kubenswrapper[4860]: I0320 10:59:54.079006 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jx27x" Mar 20 10:59:54 crc kubenswrapper[4860]: I0320 10:59:54.079456 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jx27x" Mar 20 10:59:54 crc kubenswrapper[4860]: I0320 10:59:54.699412 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qq8bh" podUID="514f05c3-1404-46c6-9f4d-68437ea8ee0b" containerName="registry-server" probeResult="failure" output=< Mar 20 10:59:54 crc kubenswrapper[4860]: timeout: failed to connect service ":50051" within 1s Mar 20 10:59:54 crc kubenswrapper[4860]: > Mar 20 10:59:55 crc kubenswrapper[4860]: I0320 10:59:55.047983 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5jpww"] Mar 20 10:59:55 crc kubenswrapper[4860]: 
I0320 10:59:55.074890 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5jpww" podUID="2268b7ae-c1db-4ef4-8236-60f7cfa277a1" containerName="registry-server" containerID="cri-o://e9d5be622305f211c3e12e12dfe924fbcb59aee9d65c271786e393a4e06cec5a" gracePeriod=2 Mar 20 10:59:55 crc kubenswrapper[4860]: I0320 10:59:55.117413 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jx27x" podUID="f81a43aa-2c39-4d49-8526-f097322dd7bf" containerName="registry-server" probeResult="failure" output=< Mar 20 10:59:55 crc kubenswrapper[4860]: timeout: failed to connect service ":50051" within 1s Mar 20 10:59:55 crc kubenswrapper[4860]: > Mar 20 10:59:55 crc kubenswrapper[4860]: I0320 10:59:55.995181 4860 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 10:59:55 crc kubenswrapper[4860]: I0320 10:59:55.996606 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 10:59:55 crc kubenswrapper[4860]: I0320 10:59:55.996902 4860 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 10:59:55 crc kubenswrapper[4860]: I0320 10:59:55.997483 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433" gracePeriod=15 Mar 20 10:59:55 crc kubenswrapper[4860]: I0320 10:59:55.997489 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://2c14e94ed75bdc67469f86002f56ed08fc3810f5a6316401c00a66d56d405e14" gracePeriod=15 Mar 20 10:59:55 crc kubenswrapper[4860]: I0320 10:59:55.997542 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392" gracePeriod=15 Mar 20 10:59:55 crc kubenswrapper[4860]: I0320 10:59:55.997554 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e" gracePeriod=15 Mar 20 10:59:55 crc kubenswrapper[4860]: I0320 10:59:55.997500 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017" gracePeriod=15 Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:55.999307 4860 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 10:59:56 crc kubenswrapper[4860]: E0320 10:59:55.999481 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:55.999501 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 20 10:59:56 crc kubenswrapper[4860]: E0320 10:59:55.999515 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:55.999525 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 20 10:59:56 crc kubenswrapper[4860]: E0320 10:59:55.999534 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:55.999541 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 10:59:56 crc kubenswrapper[4860]: E0320 10:59:55.999555 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:55.999562 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 20 10:59:56 crc kubenswrapper[4860]: E0320 10:59:55.999572 4860 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:55.999581 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 10:59:56 crc kubenswrapper[4860]: E0320 10:59:55.999596 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:55.999604 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 20 10:59:56 crc kubenswrapper[4860]: E0320 10:59:55.999614 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:55.999622 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 20 10:59:56 crc kubenswrapper[4860]: E0320 10:59:55.999634 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:55.999641 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:55.999779 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:55.999807 4860 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:55.999815 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:55.999821 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:55.999830 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:55.999839 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:55.999849 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:55.999859 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 20 10:59:56 crc kubenswrapper[4860]: E0320 10:59:55.999999 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.000014 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.000159 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" Mar 20 10:59:56 crc kubenswrapper[4860]: E0320 10:59:56.000328 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.000339 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.092498 4860 generic.go:334] "Generic (PLEG): container finished" podID="2268b7ae-c1db-4ef4-8236-60f7cfa277a1" containerID="e9d5be622305f211c3e12e12dfe924fbcb59aee9d65c271786e393a4e06cec5a" exitCode=0 Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.092547 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5jpww" event={"ID":"2268b7ae-c1db-4ef4-8236-60f7cfa277a1","Type":"ContainerDied","Data":"e9d5be622305f211c3e12e12dfe924fbcb59aee9d65c271786e393a4e06cec5a"} Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.092577 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5jpww" event={"ID":"2268b7ae-c1db-4ef4-8236-60f7cfa277a1","Type":"ContainerDied","Data":"5f7c8a0760c1233f7a673fb0037c3446fe619e19acbc1953810d2c42b3db815b"} Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.092588 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f7c8a0760c1233f7a673fb0037c3446fe619e19acbc1953810d2c42b3db815b" Mar 20 10:59:56 crc kubenswrapper[4860]: E0320 10:59:56.098574 4860 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.201:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.105421 4860 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5jpww" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.106004 4860 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.106170 4860 status_manager.go:851] "Failed to get status for pod" podUID="2268b7ae-c1db-4ef4-8236-60f7cfa277a1" pod="openshift-marketplace/redhat-marketplace-5jpww" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5jpww\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.130094 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.130153 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.130174 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: 
\"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.130195 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.130247 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.130325 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.130355 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.130384 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.231636 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rln5j\" (UniqueName: \"kubernetes.io/projected/2268b7ae-c1db-4ef4-8236-60f7cfa277a1-kube-api-access-rln5j\") pod \"2268b7ae-c1db-4ef4-8236-60f7cfa277a1\" (UID: \"2268b7ae-c1db-4ef4-8236-60f7cfa277a1\") " Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.231782 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2268b7ae-c1db-4ef4-8236-60f7cfa277a1-utilities\") pod \"2268b7ae-c1db-4ef4-8236-60f7cfa277a1\" (UID: \"2268b7ae-c1db-4ef4-8236-60f7cfa277a1\") " Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.231816 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2268b7ae-c1db-4ef4-8236-60f7cfa277a1-catalog-content\") pod \"2268b7ae-c1db-4ef4-8236-60f7cfa277a1\" (UID: \"2268b7ae-c1db-4ef4-8236-60f7cfa277a1\") " Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.232151 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.232193 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: 
\"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.232240 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.232291 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.232335 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.232375 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.232402 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.232438 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.232483 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.232438 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.232448 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.232418 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 
10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.232577 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.232639 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.232721 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.232805 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.233351 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2268b7ae-c1db-4ef4-8236-60f7cfa277a1-utilities" (OuterVolumeSpecName: "utilities") pod "2268b7ae-c1db-4ef4-8236-60f7cfa277a1" (UID: "2268b7ae-c1db-4ef4-8236-60f7cfa277a1"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.237901 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2268b7ae-c1db-4ef4-8236-60f7cfa277a1-kube-api-access-rln5j" (OuterVolumeSpecName: "kube-api-access-rln5j") pod "2268b7ae-c1db-4ef4-8236-60f7cfa277a1" (UID: "2268b7ae-c1db-4ef4-8236-60f7cfa277a1"). InnerVolumeSpecName "kube-api-access-rln5j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.255434 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2268b7ae-c1db-4ef4-8236-60f7cfa277a1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2268b7ae-c1db-4ef4-8236-60f7cfa277a1" (UID: "2268b7ae-c1db-4ef4-8236-60f7cfa277a1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.283876 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" podUID="3587f3ba-577b-425a-adf5-336a8977dcc5" containerName="oauth-openshift" containerID="cri-o://7c8b8128f311582fc677a42230767316cb90b5bf061457d3564d8b09eec2a186" gracePeriod=15 Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.334880 4860 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2268b7ae-c1db-4ef4-8236-60f7cfa277a1-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.334930 4860 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2268b7ae-c1db-4ef4-8236-60f7cfa277a1-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.334941 4860 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-rln5j\" (UniqueName: \"kubernetes.io/projected/2268b7ae-c1db-4ef4-8236-60f7cfa277a1-kube-api-access-rln5j\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.399842 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 10:59:56 crc kubenswrapper[4860]: W0320 10:59:56.424258 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-b832830c7c00cbebd44efc75feed3999dcf7ab0de95aed014738763ca98518d7 WatchSource:0}: Error finding container b832830c7c00cbebd44efc75feed3999dcf7ab0de95aed014738763ca98518d7: Status 404 returned error can't find the container with id b832830c7c00cbebd44efc75feed3999dcf7ab0de95aed014738763ca98518d7 Mar 20 10:59:56 crc kubenswrapper[4860]: E0320 10:59:56.427966 4860 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.201:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189e8799770a1bd4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:59:56.427451348 +0000 UTC m=+320.648812246,LastTimestamp:2026-03-20 10:59:56.427451348 +0000 UTC 
m=+320.648812246,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.746487 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.747744 4860 status_manager.go:851] "Failed to get status for pod" podUID="3587f3ba-577b-425a-adf5-336a8977dcc5" pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-srz5x\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.748331 4860 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.748705 4860 status_manager.go:851] "Failed to get status for pod" podUID="2268b7ae-c1db-4ef4-8236-60f7cfa277a1" pod="openshift-marketplace/redhat-marketplace-5jpww" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5jpww\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.842114 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3587f3ba-577b-425a-adf5-336a8977dcc5-audit-dir\") pod \"3587f3ba-577b-425a-adf5-336a8977dcc5\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.842273 4860 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-user-template-login\") pod \"3587f3ba-577b-425a-adf5-336a8977dcc5\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.842293 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3587f3ba-577b-425a-adf5-336a8977dcc5-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "3587f3ba-577b-425a-adf5-336a8977dcc5" (UID: "3587f3ba-577b-425a-adf5-336a8977dcc5"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.842311 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-service-ca\") pod \"3587f3ba-577b-425a-adf5-336a8977dcc5\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.842428 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-trusted-ca-bundle\") pod \"3587f3ba-577b-425a-adf5-336a8977dcc5\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.842504 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-ocp-branding-template\") pod \"3587f3ba-577b-425a-adf5-336a8977dcc5\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.842559 4860 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-user-template-provider-selection\") pod \"3587f3ba-577b-425a-adf5-336a8977dcc5\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.842643 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-user-idp-0-file-data\") pod \"3587f3ba-577b-425a-adf5-336a8977dcc5\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.842705 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-user-template-error\") pod \"3587f3ba-577b-425a-adf5-336a8977dcc5\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.842776 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-session\") pod \"3587f3ba-577b-425a-adf5-336a8977dcc5\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.842823 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-cliconfig\") pod \"3587f3ba-577b-425a-adf5-336a8977dcc5\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.842900 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-router-certs\") pod \"3587f3ba-577b-425a-adf5-336a8977dcc5\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.842960 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqnwc\" (UniqueName: \"kubernetes.io/projected/3587f3ba-577b-425a-adf5-336a8977dcc5-kube-api-access-rqnwc\") pod \"3587f3ba-577b-425a-adf5-336a8977dcc5\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.843048 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3587f3ba-577b-425a-adf5-336a8977dcc5-audit-policies\") pod \"3587f3ba-577b-425a-adf5-336a8977dcc5\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.843109 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-serving-cert\") pod \"3587f3ba-577b-425a-adf5-336a8977dcc5\" (UID: \"3587f3ba-577b-425a-adf5-336a8977dcc5\") " Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.843326 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "3587f3ba-577b-425a-adf5-336a8977dcc5" (UID: "3587f3ba-577b-425a-adf5-336a8977dcc5"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.843378 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "3587f3ba-577b-425a-adf5-336a8977dcc5" (UID: "3587f3ba-577b-425a-adf5-336a8977dcc5"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.843757 4860 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3587f3ba-577b-425a-adf5-336a8977dcc5-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.843801 4860 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.843831 4860 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.844120 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "3587f3ba-577b-425a-adf5-336a8977dcc5" (UID: "3587f3ba-577b-425a-adf5-336a8977dcc5"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.844788 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3587f3ba-577b-425a-adf5-336a8977dcc5-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "3587f3ba-577b-425a-adf5-336a8977dcc5" (UID: "3587f3ba-577b-425a-adf5-336a8977dcc5"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.846837 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "3587f3ba-577b-425a-adf5-336a8977dcc5" (UID: "3587f3ba-577b-425a-adf5-336a8977dcc5"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.847341 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "3587f3ba-577b-425a-adf5-336a8977dcc5" (UID: "3587f3ba-577b-425a-adf5-336a8977dcc5"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.847875 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "3587f3ba-577b-425a-adf5-336a8977dcc5" (UID: "3587f3ba-577b-425a-adf5-336a8977dcc5"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.848137 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "3587f3ba-577b-425a-adf5-336a8977dcc5" (UID: "3587f3ba-577b-425a-adf5-336a8977dcc5"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.848147 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3587f3ba-577b-425a-adf5-336a8977dcc5-kube-api-access-rqnwc" (OuterVolumeSpecName: "kube-api-access-rqnwc") pod "3587f3ba-577b-425a-adf5-336a8977dcc5" (UID: "3587f3ba-577b-425a-adf5-336a8977dcc5"). InnerVolumeSpecName "kube-api-access-rqnwc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.850107 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "3587f3ba-577b-425a-adf5-336a8977dcc5" (UID: "3587f3ba-577b-425a-adf5-336a8977dcc5"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.850870 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "3587f3ba-577b-425a-adf5-336a8977dcc5" (UID: "3587f3ba-577b-425a-adf5-336a8977dcc5"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.850950 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "3587f3ba-577b-425a-adf5-336a8977dcc5" (UID: "3587f3ba-577b-425a-adf5-336a8977dcc5"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.851118 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "3587f3ba-577b-425a-adf5-336a8977dcc5" (UID: "3587f3ba-577b-425a-adf5-336a8977dcc5"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.945792 4860 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.945835 4860 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.945854 4860 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.945870 4860 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.945888 4860 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.945903 4860 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.945917 4860 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.945931 4860 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.945946 4860 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3587f3ba-577b-425a-adf5-336a8977dcc5-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.945960 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqnwc\" (UniqueName: \"kubernetes.io/projected/3587f3ba-577b-425a-adf5-336a8977dcc5-kube-api-access-rqnwc\") on node 
\"crc\" DevicePath \"\"" Mar 20 10:59:56 crc kubenswrapper[4860]: I0320 10:59:56.945972 4860 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3587f3ba-577b-425a-adf5-336a8977dcc5-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.103407 4860 generic.go:334] "Generic (PLEG): container finished" podID="3587f3ba-577b-425a-adf5-336a8977dcc5" containerID="7c8b8128f311582fc677a42230767316cb90b5bf061457d3564d8b09eec2a186" exitCode=0 Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.103528 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" event={"ID":"3587f3ba-577b-425a-adf5-336a8977dcc5","Type":"ContainerDied","Data":"7c8b8128f311582fc677a42230767316cb90b5bf061457d3564d8b09eec2a186"} Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.103573 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" event={"ID":"3587f3ba-577b-425a-adf5-336a8977dcc5","Type":"ContainerDied","Data":"366c71d2561bff010f4d5dff91d7764636b34e8d53c1f0235c50a2b7eb65710b"} Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.103602 4860 scope.go:117] "RemoveContainer" containerID="7c8b8128f311582fc677a42230767316cb90b5bf061457d3564d8b09eec2a186" Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.104328 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.106156 4860 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.106647 4860 status_manager.go:851] "Failed to get status for pod" podUID="2268b7ae-c1db-4ef4-8236-60f7cfa277a1" pod="openshift-marketplace/redhat-marketplace-5jpww" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5jpww\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.107197 4860 status_manager.go:851] "Failed to get status for pod" podUID="3587f3ba-577b-425a-adf5-336a8977dcc5" pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-srz5x\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.107588 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.109437 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.110312 4860 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="2c14e94ed75bdc67469f86002f56ed08fc3810f5a6316401c00a66d56d405e14" exitCode=0 Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.110337 4860 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392" exitCode=0 Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.110347 4860 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017" exitCode=0 Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.110359 4860 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e" exitCode=2 Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.112564 4860 generic.go:334] "Generic (PLEG): container finished" podID="f3e81cb0-739d-41be-b9fb-53a1c3de6a3d" containerID="8d39974009a23179ff960e42776dd6479d915e26394e49ae5752f3d385d29790" exitCode=0 Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.112636 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"f3e81cb0-739d-41be-b9fb-53a1c3de6a3d","Type":"ContainerDied","Data":"8d39974009a23179ff960e42776dd6479d915e26394e49ae5752f3d385d29790"} Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.113513 4860 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.113705 4860 status_manager.go:851] "Failed to get status for pod" podUID="2268b7ae-c1db-4ef4-8236-60f7cfa277a1" 
pod="openshift-marketplace/redhat-marketplace-5jpww" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5jpww\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.113849 4860 status_manager.go:851] "Failed to get status for pod" podUID="f3e81cb0-739d-41be-b9fb-53a1c3de6a3d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.113993 4860 status_manager.go:851] "Failed to get status for pod" podUID="3587f3ba-577b-425a-adf5-336a8977dcc5" pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-srz5x\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.125023 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5jpww" Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.125307 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"6f9fcf8849363a86240c3e522164d76994e316f4d215b2b524d3e54d9f3d5cbb"} Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.125367 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"b832830c7c00cbebd44efc75feed3999dcf7ab0de95aed014738763ca98518d7"} Mar 20 10:59:57 crc kubenswrapper[4860]: E0320 10:59:57.127032 4860 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.201:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.128456 4860 status_manager.go:851] "Failed to get status for pod" podUID="3587f3ba-577b-425a-adf5-336a8977dcc5" pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-srz5x\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.128831 4860 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.129195 4860 status_manager.go:851] "Failed to get status 
for pod" podUID="2268b7ae-c1db-4ef4-8236-60f7cfa277a1" pod="openshift-marketplace/redhat-marketplace-5jpww" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5jpww\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.129858 4860 status_manager.go:851] "Failed to get status for pod" podUID="f3e81cb0-739d-41be-b9fb-53a1c3de6a3d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.130439 4860 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.130827 4860 status_manager.go:851] "Failed to get status for pod" podUID="2268b7ae-c1db-4ef4-8236-60f7cfa277a1" pod="openshift-marketplace/redhat-marketplace-5jpww" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5jpww\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.131158 4860 status_manager.go:851] "Failed to get status for pod" podUID="f3e81cb0-739d-41be-b9fb-53a1c3de6a3d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.131616 4860 status_manager.go:851] "Failed to get status for pod" 
podUID="3587f3ba-577b-425a-adf5-336a8977dcc5" pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-srz5x\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.132578 4860 status_manager.go:851] "Failed to get status for pod" podUID="2268b7ae-c1db-4ef4-8236-60f7cfa277a1" pod="openshift-marketplace/redhat-marketplace-5jpww" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5jpww\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.133123 4860 status_manager.go:851] "Failed to get status for pod" podUID="f3e81cb0-739d-41be-b9fb-53a1c3de6a3d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.133611 4860 status_manager.go:851] "Failed to get status for pod" podUID="3587f3ba-577b-425a-adf5-336a8977dcc5" pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-srz5x\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.134010 4860 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.135614 4860 scope.go:117] "RemoveContainer" 
containerID="7c8b8128f311582fc677a42230767316cb90b5bf061457d3564d8b09eec2a186" Mar 20 10:59:57 crc kubenswrapper[4860]: E0320 10:59:57.138693 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c8b8128f311582fc677a42230767316cb90b5bf061457d3564d8b09eec2a186\": container with ID starting with 7c8b8128f311582fc677a42230767316cb90b5bf061457d3564d8b09eec2a186 not found: ID does not exist" containerID="7c8b8128f311582fc677a42230767316cb90b5bf061457d3564d8b09eec2a186" Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.138802 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c8b8128f311582fc677a42230767316cb90b5bf061457d3564d8b09eec2a186"} err="failed to get container status \"7c8b8128f311582fc677a42230767316cb90b5bf061457d3564d8b09eec2a186\": rpc error: code = NotFound desc = could not find container \"7c8b8128f311582fc677a42230767316cb90b5bf061457d3564d8b09eec2a186\": container with ID starting with 7c8b8128f311582fc677a42230767316cb90b5bf061457d3564d8b09eec2a186 not found: ID does not exist" Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.138853 4860 scope.go:117] "RemoveContainer" containerID="b587466abe63358402650860c4e1013df80ee94ae2522c9485bc2286d0c3e4fa" Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.139996 4860 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.140192 4860 status_manager.go:851] "Failed to get status for pod" podUID="2268b7ae-c1db-4ef4-8236-60f7cfa277a1" pod="openshift-marketplace/redhat-marketplace-5jpww" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5jpww\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.140483 4860 status_manager.go:851] "Failed to get status for pod" podUID="f3e81cb0-739d-41be-b9fb-53a1c3de6a3d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.140862 4860 status_manager.go:851] "Failed to get status for pod" podUID="3587f3ba-577b-425a-adf5-336a8977dcc5" pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-srz5x\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.419531 4860 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.420628 4860 status_manager.go:851] "Failed to get status for pod" podUID="2268b7ae-c1db-4ef4-8236-60f7cfa277a1" pod="openshift-marketplace/redhat-marketplace-5jpww" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5jpww\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.421285 4860 status_manager.go:851] "Failed to get status for pod" podUID="f3e81cb0-739d-41be-b9fb-53a1c3de6a3d" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 10:59:57 crc kubenswrapper[4860]: I0320 10:59:57.422048 4860 status_manager.go:851] "Failed to get status for pod" podUID="3587f3ba-577b-425a-adf5-336a8977dcc5" pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-srz5x\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 10:59:58 crc kubenswrapper[4860]: I0320 10:59:58.147453 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 10:59:58 crc kubenswrapper[4860]: I0320 10:59:58.381925 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 10:59:58 crc kubenswrapper[4860]: I0320 10:59:58.383967 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:59:58 crc kubenswrapper[4860]: I0320 10:59:58.384625 4860 status_manager.go:851] "Failed to get status for pod" podUID="3587f3ba-577b-425a-adf5-336a8977dcc5" pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-srz5x\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 10:59:58 crc kubenswrapper[4860]: I0320 10:59:58.385130 4860 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 10:59:58 crc kubenswrapper[4860]: I0320 10:59:58.385325 4860 status_manager.go:851] "Failed to get status for pod" podUID="2268b7ae-c1db-4ef4-8236-60f7cfa277a1" pod="openshift-marketplace/redhat-marketplace-5jpww" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5jpww\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 10:59:58 crc kubenswrapper[4860]: I0320 10:59:58.385471 4860 status_manager.go:851] "Failed to get status for pod" podUID="f3e81cb0-739d-41be-b9fb-53a1c3de6a3d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 10:59:58 crc kubenswrapper[4860]: I0320 10:59:58.532075 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 10:59:58 crc kubenswrapper[4860]: I0320 10:59:58.532740 4860 status_manager.go:851] "Failed to get status for pod" podUID="3587f3ba-577b-425a-adf5-336a8977dcc5" pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-srz5x\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 10:59:58 crc kubenswrapper[4860]: I0320 10:59:58.533023 4860 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 10:59:58 crc kubenswrapper[4860]: I0320 10:59:58.533372 4860 status_manager.go:851] "Failed to get status for pod" podUID="2268b7ae-c1db-4ef4-8236-60f7cfa277a1" pod="openshift-marketplace/redhat-marketplace-5jpww" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5jpww\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 10:59:58 crc kubenswrapper[4860]: I0320 10:59:58.533644 4860 status_manager.go:851] "Failed to get status for pod" podUID="f3e81cb0-739d-41be-b9fb-53a1c3de6a3d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 10:59:58 crc kubenswrapper[4860]: I0320 10:59:58.570630 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 20 
10:59:58 crc kubenswrapper[4860]: I0320 10:59:58.570775 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 10:59:58 crc kubenswrapper[4860]: I0320 10:59:58.571316 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 20 10:59:58 crc kubenswrapper[4860]: I0320 10:59:58.571551 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 20 10:59:58 crc kubenswrapper[4860]: I0320 10:59:58.571431 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 10:59:58 crc kubenswrapper[4860]: I0320 10:59:58.571591 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 10:59:58 crc kubenswrapper[4860]: I0320 10:59:58.572466 4860 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:58 crc kubenswrapper[4860]: I0320 10:59:58.572611 4860 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:58 crc kubenswrapper[4860]: I0320 10:59:58.572728 4860 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:58 crc kubenswrapper[4860]: I0320 10:59:58.674389 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f3e81cb0-739d-41be-b9fb-53a1c3de6a3d-var-lock\") pod \"f3e81cb0-739d-41be-b9fb-53a1c3de6a3d\" (UID: \"f3e81cb0-739d-41be-b9fb-53a1c3de6a3d\") " Mar 20 10:59:58 crc kubenswrapper[4860]: I0320 10:59:58.674493 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f3e81cb0-739d-41be-b9fb-53a1c3de6a3d-kubelet-dir\") pod \"f3e81cb0-739d-41be-b9fb-53a1c3de6a3d\" (UID: \"f3e81cb0-739d-41be-b9fb-53a1c3de6a3d\") " Mar 20 10:59:58 crc kubenswrapper[4860]: I0320 10:59:58.674570 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f3e81cb0-739d-41be-b9fb-53a1c3de6a3d-kube-api-access\") pod \"f3e81cb0-739d-41be-b9fb-53a1c3de6a3d\" (UID: \"f3e81cb0-739d-41be-b9fb-53a1c3de6a3d\") " Mar 20 10:59:58 crc kubenswrapper[4860]: I0320 10:59:58.674562 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/host-path/f3e81cb0-739d-41be-b9fb-53a1c3de6a3d-var-lock" (OuterVolumeSpecName: "var-lock") pod "f3e81cb0-739d-41be-b9fb-53a1c3de6a3d" (UID: "f3e81cb0-739d-41be-b9fb-53a1c3de6a3d"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 10:59:58 crc kubenswrapper[4860]: I0320 10:59:58.674645 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f3e81cb0-739d-41be-b9fb-53a1c3de6a3d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "f3e81cb0-739d-41be-b9fb-53a1c3de6a3d" (UID: "f3e81cb0-739d-41be-b9fb-53a1c3de6a3d"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 10:59:58 crc kubenswrapper[4860]: I0320 10:59:58.674996 4860 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f3e81cb0-739d-41be-b9fb-53a1c3de6a3d-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:58 crc kubenswrapper[4860]: I0320 10:59:58.675028 4860 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f3e81cb0-739d-41be-b9fb-53a1c3de6a3d-var-lock\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:58 crc kubenswrapper[4860]: I0320 10:59:58.680869 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3e81cb0-739d-41be-b9fb-53a1c3de6a3d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "f3e81cb0-739d-41be-b9fb-53a1c3de6a3d" (UID: "f3e81cb0-739d-41be-b9fb-53a1c3de6a3d"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:59:58 crc kubenswrapper[4860]: I0320 10:59:58.776953 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f3e81cb0-739d-41be-b9fb-53a1c3de6a3d-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:59 crc kubenswrapper[4860]: I0320 10:59:59.155705 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"f3e81cb0-739d-41be-b9fb-53a1c3de6a3d","Type":"ContainerDied","Data":"e4aa9d22701e1f5fe237a916fc50c217011c513544d30da810d469bc44fe2386"} Mar 20 10:59:59 crc kubenswrapper[4860]: I0320 10:59:59.155758 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e4aa9d22701e1f5fe237a916fc50c217011c513544d30da810d469bc44fe2386" Mar 20 10:59:59 crc kubenswrapper[4860]: I0320 10:59:59.155762 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 10:59:59 crc kubenswrapper[4860]: I0320 10:59:59.161419 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 10:59:59 crc kubenswrapper[4860]: I0320 10:59:59.162570 4860 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433" exitCode=0 Mar 20 10:59:59 crc kubenswrapper[4860]: I0320 10:59:59.162634 4860 scope.go:117] "RemoveContainer" containerID="2c14e94ed75bdc67469f86002f56ed08fc3810f5a6316401c00a66d56d405e14" Mar 20 10:59:59 crc kubenswrapper[4860]: I0320 10:59:59.162699 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:59:59 crc kubenswrapper[4860]: I0320 10:59:59.169875 4860 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 10:59:59 crc kubenswrapper[4860]: I0320 10:59:59.170662 4860 status_manager.go:851] "Failed to get status for pod" podUID="2268b7ae-c1db-4ef4-8236-60f7cfa277a1" pod="openshift-marketplace/redhat-marketplace-5jpww" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5jpww\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 10:59:59 crc kubenswrapper[4860]: I0320 10:59:59.171173 4860 status_manager.go:851] "Failed to get status for pod" podUID="f3e81cb0-739d-41be-b9fb-53a1c3de6a3d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 10:59:59 crc kubenswrapper[4860]: I0320 10:59:59.171440 4860 status_manager.go:851] "Failed to get status for pod" podUID="3587f3ba-577b-425a-adf5-336a8977dcc5" pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-srz5x\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 10:59:59 crc kubenswrapper[4860]: I0320 10:59:59.177836 4860 status_manager.go:851] "Failed to get status for pod" podUID="f3e81cb0-739d-41be-b9fb-53a1c3de6a3d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 
38.102.83.201:6443: connect: connection refused" Mar 20 10:59:59 crc kubenswrapper[4860]: I0320 10:59:59.178046 4860 status_manager.go:851] "Failed to get status for pod" podUID="3587f3ba-577b-425a-adf5-336a8977dcc5" pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-srz5x\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 10:59:59 crc kubenswrapper[4860]: I0320 10:59:59.178282 4860 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 10:59:59 crc kubenswrapper[4860]: I0320 10:59:59.178604 4860 status_manager.go:851] "Failed to get status for pod" podUID="2268b7ae-c1db-4ef4-8236-60f7cfa277a1" pod="openshift-marketplace/redhat-marketplace-5jpww" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5jpww\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 10:59:59 crc kubenswrapper[4860]: I0320 10:59:59.199545 4860 scope.go:117] "RemoveContainer" containerID="7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392" Mar 20 10:59:59 crc kubenswrapper[4860]: I0320 10:59:59.218343 4860 scope.go:117] "RemoveContainer" containerID="59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017" Mar 20 10:59:59 crc kubenswrapper[4860]: I0320 10:59:59.231818 4860 scope.go:117] "RemoveContainer" containerID="e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e" Mar 20 10:59:59 crc kubenswrapper[4860]: I0320 10:59:59.251164 4860 scope.go:117] "RemoveContainer" containerID="b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433" Mar 20 10:59:59 crc 
kubenswrapper[4860]: I0320 10:59:59.270986 4860 scope.go:117] "RemoveContainer" containerID="a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4" Mar 20 10:59:59 crc kubenswrapper[4860]: I0320 10:59:59.294864 4860 scope.go:117] "RemoveContainer" containerID="2c14e94ed75bdc67469f86002f56ed08fc3810f5a6316401c00a66d56d405e14" Mar 20 10:59:59 crc kubenswrapper[4860]: E0320 10:59:59.295414 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c14e94ed75bdc67469f86002f56ed08fc3810f5a6316401c00a66d56d405e14\": container with ID starting with 2c14e94ed75bdc67469f86002f56ed08fc3810f5a6316401c00a66d56d405e14 not found: ID does not exist" containerID="2c14e94ed75bdc67469f86002f56ed08fc3810f5a6316401c00a66d56d405e14" Mar 20 10:59:59 crc kubenswrapper[4860]: I0320 10:59:59.295545 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c14e94ed75bdc67469f86002f56ed08fc3810f5a6316401c00a66d56d405e14"} err="failed to get container status \"2c14e94ed75bdc67469f86002f56ed08fc3810f5a6316401c00a66d56d405e14\": rpc error: code = NotFound desc = could not find container \"2c14e94ed75bdc67469f86002f56ed08fc3810f5a6316401c00a66d56d405e14\": container with ID starting with 2c14e94ed75bdc67469f86002f56ed08fc3810f5a6316401c00a66d56d405e14 not found: ID does not exist" Mar 20 10:59:59 crc kubenswrapper[4860]: I0320 10:59:59.295641 4860 scope.go:117] "RemoveContainer" containerID="7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392" Mar 20 10:59:59 crc kubenswrapper[4860]: E0320 10:59:59.296215 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\": container with ID starting with 7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392 not found: ID does not exist" 
containerID="7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392" Mar 20 10:59:59 crc kubenswrapper[4860]: I0320 10:59:59.296280 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392"} err="failed to get container status \"7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\": rpc error: code = NotFound desc = could not find container \"7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392\": container with ID starting with 7bd8cb6511f9ac06b2dec93ca815ba4cb62351751ef845c761e12b72c88aa392 not found: ID does not exist" Mar 20 10:59:59 crc kubenswrapper[4860]: I0320 10:59:59.296306 4860 scope.go:117] "RemoveContainer" containerID="59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017" Mar 20 10:59:59 crc kubenswrapper[4860]: E0320 10:59:59.296612 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\": container with ID starting with 59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017 not found: ID does not exist" containerID="59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017" Mar 20 10:59:59 crc kubenswrapper[4860]: I0320 10:59:59.296640 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017"} err="failed to get container status \"59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\": rpc error: code = NotFound desc = could not find container \"59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017\": container with ID starting with 59ed071ee7bcd25da49c3ea7369d1d9eff5f3f3dc917cd5776687bb34bd62017 not found: ID does not exist" Mar 20 10:59:59 crc kubenswrapper[4860]: I0320 10:59:59.296658 4860 scope.go:117] 
"RemoveContainer" containerID="e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e" Mar 20 10:59:59 crc kubenswrapper[4860]: E0320 10:59:59.296998 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\": container with ID starting with e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e not found: ID does not exist" containerID="e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e" Mar 20 10:59:59 crc kubenswrapper[4860]: I0320 10:59:59.297109 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e"} err="failed to get container status \"e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\": rpc error: code = NotFound desc = could not find container \"e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e\": container with ID starting with e2a078366a52b32335f49beb0e9f1ffcabf23fe1a66352d82b94e7aaf157617e not found: ID does not exist" Mar 20 10:59:59 crc kubenswrapper[4860]: I0320 10:59:59.297540 4860 scope.go:117] "RemoveContainer" containerID="b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433" Mar 20 10:59:59 crc kubenswrapper[4860]: E0320 10:59:59.298371 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\": container with ID starting with b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433 not found: ID does not exist" containerID="b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433" Mar 20 10:59:59 crc kubenswrapper[4860]: I0320 10:59:59.298402 4860 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433"} err="failed to get container status \"b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\": rpc error: code = NotFound desc = could not find container \"b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433\": container with ID starting with b738d17a8af76ed0884c2e4b3844310502e54b1a9784a9e7cbc27470da2b6433 not found: ID does not exist" Mar 20 10:59:59 crc kubenswrapper[4860]: I0320 10:59:59.298418 4860 scope.go:117] "RemoveContainer" containerID="a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4" Mar 20 10:59:59 crc kubenswrapper[4860]: E0320 10:59:59.298722 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\": container with ID starting with a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4 not found: ID does not exist" containerID="a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4" Mar 20 10:59:59 crc kubenswrapper[4860]: I0320 10:59:59.298748 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4"} err="failed to get container status \"a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\": rpc error: code = NotFound desc = could not find container \"a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4\": container with ID starting with a09e43b45f2edbec51b3d3bc8f5277f0bc5a32585f4e44754544df8ec89f5ef4 not found: ID does not exist" Mar 20 10:59:59 crc kubenswrapper[4860]: I0320 10:59:59.437044 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 20 10:59:59 crc kubenswrapper[4860]: E0320 
10:59:59.491049 4860 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.201:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" volumeName="registry-storage" Mar 20 11:00:00 crc kubenswrapper[4860]: E0320 11:00:00.964653 4860 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:00 crc kubenswrapper[4860]: E0320 11:00:00.965663 4860 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:00 crc kubenswrapper[4860]: E0320 11:00:00.965937 4860 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:00 crc kubenswrapper[4860]: E0320 11:00:00.966173 4860 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:00 crc kubenswrapper[4860]: E0320 11:00:00.966457 4860 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:00 crc 
kubenswrapper[4860]: I0320 11:00:00.966490 4860 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 20 11:00:00 crc kubenswrapper[4860]: E0320 11:00:00.966824 4860 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused" interval="200ms" Mar 20 11:00:01 crc kubenswrapper[4860]: E0320 11:00:01.170163 4860 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused" interval="400ms" Mar 20 11:00:01 crc kubenswrapper[4860]: I0320 11:00:01.207819 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-r7ckk" Mar 20 11:00:01 crc kubenswrapper[4860]: I0320 11:00:01.208577 4860 status_manager.go:851] "Failed to get status for pod" podUID="3587f3ba-577b-425a-adf5-336a8977dcc5" pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-srz5x\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:01 crc kubenswrapper[4860]: I0320 11:00:01.208842 4860 status_manager.go:851] "Failed to get status for pod" podUID="f0e14a08-824b-450f-bf98-2a476da0d44b" pod="openshift-marketplace/community-operators-r7ckk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-r7ckk\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:01 crc kubenswrapper[4860]: I0320 11:00:01.209155 4860 status_manager.go:851] "Failed to get status for pod" 
podUID="2268b7ae-c1db-4ef4-8236-60f7cfa277a1" pod="openshift-marketplace/redhat-marketplace-5jpww" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5jpww\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:01 crc kubenswrapper[4860]: I0320 11:00:01.209545 4860 status_manager.go:851] "Failed to get status for pod" podUID="f3e81cb0-739d-41be-b9fb-53a1c3de6a3d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:01 crc kubenswrapper[4860]: I0320 11:00:01.247241 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-r7ckk" Mar 20 11:00:01 crc kubenswrapper[4860]: I0320 11:00:01.247737 4860 status_manager.go:851] "Failed to get status for pod" podUID="3587f3ba-577b-425a-adf5-336a8977dcc5" pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-srz5x\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:01 crc kubenswrapper[4860]: I0320 11:00:01.248158 4860 status_manager.go:851] "Failed to get status for pod" podUID="f0e14a08-824b-450f-bf98-2a476da0d44b" pod="openshift-marketplace/community-operators-r7ckk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-r7ckk\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:01 crc kubenswrapper[4860]: I0320 11:00:01.248518 4860 status_manager.go:851] "Failed to get status for pod" podUID="2268b7ae-c1db-4ef4-8236-60f7cfa277a1" pod="openshift-marketplace/redhat-marketplace-5jpww" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5jpww\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:01 crc kubenswrapper[4860]: I0320 11:00:01.249029 4860 status_manager.go:851] "Failed to get status for pod" podUID="f3e81cb0-739d-41be-b9fb-53a1c3de6a3d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:01 crc kubenswrapper[4860]: E0320 11:00:01.571914 4860 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused" interval="800ms" Mar 20 11:00:02 crc kubenswrapper[4860]: E0320 11:00:02.373024 4860 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused" interval="1.6s" Mar 20 11:00:03 crc kubenswrapper[4860]: I0320 11:00:03.697593 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qq8bh" Mar 20 11:00:03 crc kubenswrapper[4860]: I0320 11:00:03.698479 4860 status_manager.go:851] "Failed to get status for pod" podUID="3587f3ba-577b-425a-adf5-336a8977dcc5" pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-srz5x\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:03 crc kubenswrapper[4860]: I0320 11:00:03.698918 4860 status_manager.go:851] "Failed to get status for pod" podUID="514f05c3-1404-46c6-9f4d-68437ea8ee0b" 
pod="openshift-marketplace/redhat-operators-qq8bh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-qq8bh\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:03 crc kubenswrapper[4860]: I0320 11:00:03.699129 4860 status_manager.go:851] "Failed to get status for pod" podUID="f0e14a08-824b-450f-bf98-2a476da0d44b" pod="openshift-marketplace/community-operators-r7ckk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-r7ckk\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:03 crc kubenswrapper[4860]: I0320 11:00:03.699347 4860 status_manager.go:851] "Failed to get status for pod" podUID="2268b7ae-c1db-4ef4-8236-60f7cfa277a1" pod="openshift-marketplace/redhat-marketplace-5jpww" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5jpww\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:03 crc kubenswrapper[4860]: I0320 11:00:03.699701 4860 status_manager.go:851] "Failed to get status for pod" podUID="f3e81cb0-739d-41be-b9fb-53a1c3de6a3d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:03 crc kubenswrapper[4860]: I0320 11:00:03.731763 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qq8bh" Mar 20 11:00:03 crc kubenswrapper[4860]: I0320 11:00:03.732329 4860 status_manager.go:851] "Failed to get status for pod" podUID="f0e14a08-824b-450f-bf98-2a476da0d44b" pod="openshift-marketplace/community-operators-r7ckk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-r7ckk\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 
11:00:03 crc kubenswrapper[4860]: I0320 11:00:03.732546 4860 status_manager.go:851] "Failed to get status for pod" podUID="2268b7ae-c1db-4ef4-8236-60f7cfa277a1" pod="openshift-marketplace/redhat-marketplace-5jpww" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5jpww\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:03 crc kubenswrapper[4860]: I0320 11:00:03.732850 4860 status_manager.go:851] "Failed to get status for pod" podUID="f3e81cb0-739d-41be-b9fb-53a1c3de6a3d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:03 crc kubenswrapper[4860]: I0320 11:00:03.733411 4860 status_manager.go:851] "Failed to get status for pod" podUID="3587f3ba-577b-425a-adf5-336a8977dcc5" pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-srz5x\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:03 crc kubenswrapper[4860]: I0320 11:00:03.733584 4860 status_manager.go:851] "Failed to get status for pod" podUID="514f05c3-1404-46c6-9f4d-68437ea8ee0b" pod="openshift-marketplace/redhat-operators-qq8bh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-qq8bh\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:03 crc kubenswrapper[4860]: E0320 11:00:03.974704 4860 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused" interval="3.2s" Mar 20 11:00:04 crc kubenswrapper[4860]: I0320 11:00:04.121966 4860 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jx27x" Mar 20 11:00:04 crc kubenswrapper[4860]: I0320 11:00:04.122837 4860 status_manager.go:851] "Failed to get status for pod" podUID="f3e81cb0-739d-41be-b9fb-53a1c3de6a3d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:04 crc kubenswrapper[4860]: I0320 11:00:04.123184 4860 status_manager.go:851] "Failed to get status for pod" podUID="f81a43aa-2c39-4d49-8526-f097322dd7bf" pod="openshift-marketplace/redhat-operators-jx27x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jx27x\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:04 crc kubenswrapper[4860]: I0320 11:00:04.123701 4860 status_manager.go:851] "Failed to get status for pod" podUID="3587f3ba-577b-425a-adf5-336a8977dcc5" pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-srz5x\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:04 crc kubenswrapper[4860]: I0320 11:00:04.124082 4860 status_manager.go:851] "Failed to get status for pod" podUID="514f05c3-1404-46c6-9f4d-68437ea8ee0b" pod="openshift-marketplace/redhat-operators-qq8bh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-qq8bh\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:04 crc kubenswrapper[4860]: I0320 11:00:04.124407 4860 status_manager.go:851] "Failed to get status for pod" podUID="f0e14a08-824b-450f-bf98-2a476da0d44b" pod="openshift-marketplace/community-operators-r7ckk" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-r7ckk\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:04 crc kubenswrapper[4860]: I0320 11:00:04.124728 4860 status_manager.go:851] "Failed to get status for pod" podUID="2268b7ae-c1db-4ef4-8236-60f7cfa277a1" pod="openshift-marketplace/redhat-marketplace-5jpww" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5jpww\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:04 crc kubenswrapper[4860]: I0320 11:00:04.156509 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jx27x" Mar 20 11:00:04 crc kubenswrapper[4860]: I0320 11:00:04.157251 4860 status_manager.go:851] "Failed to get status for pod" podUID="3587f3ba-577b-425a-adf5-336a8977dcc5" pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-srz5x\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:04 crc kubenswrapper[4860]: I0320 11:00:04.157627 4860 status_manager.go:851] "Failed to get status for pod" podUID="514f05c3-1404-46c6-9f4d-68437ea8ee0b" pod="openshift-marketplace/redhat-operators-qq8bh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-qq8bh\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:04 crc kubenswrapper[4860]: I0320 11:00:04.158100 4860 status_manager.go:851] "Failed to get status for pod" podUID="f0e14a08-824b-450f-bf98-2a476da0d44b" pod="openshift-marketplace/community-operators-r7ckk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-r7ckk\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:04 crc kubenswrapper[4860]: 
I0320 11:00:04.158359 4860 status_manager.go:851] "Failed to get status for pod" podUID="2268b7ae-c1db-4ef4-8236-60f7cfa277a1" pod="openshift-marketplace/redhat-marketplace-5jpww" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5jpww\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:04 crc kubenswrapper[4860]: I0320 11:00:04.158617 4860 status_manager.go:851] "Failed to get status for pod" podUID="f81a43aa-2c39-4d49-8526-f097322dd7bf" pod="openshift-marketplace/redhat-operators-jx27x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jx27x\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:04 crc kubenswrapper[4860]: I0320 11:00:04.158878 4860 status_manager.go:851] "Failed to get status for pod" podUID="f3e81cb0-739d-41be-b9fb-53a1c3de6a3d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:05 crc kubenswrapper[4860]: E0320 11:00:05.658164 4860 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.201:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189e8799770a1bd4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:59:56.427451348 +0000 UTC m=+320.648812246,LastTimestamp:2026-03-20 10:59:56.427451348 +0000 UTC m=+320.648812246,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 11:00:07 crc kubenswrapper[4860]: E0320 11:00:07.175978 4860 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused" interval="6.4s" Mar 20 11:00:07 crc kubenswrapper[4860]: I0320 11:00:07.417334 4860 status_manager.go:851] "Failed to get status for pod" podUID="3587f3ba-577b-425a-adf5-336a8977dcc5" pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-srz5x\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:07 crc kubenswrapper[4860]: I0320 11:00:07.417940 4860 status_manager.go:851] "Failed to get status for pod" podUID="514f05c3-1404-46c6-9f4d-68437ea8ee0b" pod="openshift-marketplace/redhat-operators-qq8bh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-qq8bh\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:07 crc kubenswrapper[4860]: I0320 11:00:07.418448 4860 status_manager.go:851] "Failed to get status for pod" podUID="f0e14a08-824b-450f-bf98-2a476da0d44b" pod="openshift-marketplace/community-operators-r7ckk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-r7ckk\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:07 crc kubenswrapper[4860]: I0320 11:00:07.418853 4860 status_manager.go:851] "Failed to get status for 
pod" podUID="2268b7ae-c1db-4ef4-8236-60f7cfa277a1" pod="openshift-marketplace/redhat-marketplace-5jpww" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5jpww\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:07 crc kubenswrapper[4860]: I0320 11:00:07.419283 4860 status_manager.go:851] "Failed to get status for pod" podUID="f3e81cb0-739d-41be-b9fb-53a1c3de6a3d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:07 crc kubenswrapper[4860]: I0320 11:00:07.419548 4860 status_manager.go:851] "Failed to get status for pod" podUID="f81a43aa-2c39-4d49-8526-f097322dd7bf" pod="openshift-marketplace/redhat-operators-jx27x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jx27x\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:10 crc kubenswrapper[4860]: I0320 11:00:10.241713 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 20 11:00:10 crc kubenswrapper[4860]: I0320 11:00:10.243970 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 20 11:00:10 crc kubenswrapper[4860]: I0320 11:00:10.244058 4860 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="6bf4a38879e8e3c687bd3c57a3c68a29ad9a9e609ea0cbd220493b6ee4e7d9a3" exitCode=1 Mar 20 11:00:10 crc kubenswrapper[4860]: I0320 11:00:10.244116 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"6bf4a38879e8e3c687bd3c57a3c68a29ad9a9e609ea0cbd220493b6ee4e7d9a3"} Mar 20 11:00:10 crc kubenswrapper[4860]: I0320 11:00:10.244914 4860 scope.go:117] "RemoveContainer" containerID="6bf4a38879e8e3c687bd3c57a3c68a29ad9a9e609ea0cbd220493b6ee4e7d9a3" Mar 20 11:00:10 crc kubenswrapper[4860]: I0320 11:00:10.245294 4860 status_manager.go:851] "Failed to get status for pod" podUID="2268b7ae-c1db-4ef4-8236-60f7cfa277a1" pod="openshift-marketplace/redhat-marketplace-5jpww" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5jpww\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:10 crc kubenswrapper[4860]: I0320 11:00:10.245830 4860 status_manager.go:851] "Failed to get status for pod" podUID="f81a43aa-2c39-4d49-8526-f097322dd7bf" pod="openshift-marketplace/redhat-operators-jx27x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jx27x\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:10 crc kubenswrapper[4860]: I0320 11:00:10.246290 4860 status_manager.go:851] "Failed to get status for pod" podUID="f3e81cb0-739d-41be-b9fb-53a1c3de6a3d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:10 crc kubenswrapper[4860]: I0320 11:00:10.246856 4860 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 
11:00:10 crc kubenswrapper[4860]: I0320 11:00:10.247128 4860 status_manager.go:851] "Failed to get status for pod" podUID="3587f3ba-577b-425a-adf5-336a8977dcc5" pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-srz5x\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:10 crc kubenswrapper[4860]: I0320 11:00:10.247527 4860 status_manager.go:851] "Failed to get status for pod" podUID="514f05c3-1404-46c6-9f4d-68437ea8ee0b" pod="openshift-marketplace/redhat-operators-qq8bh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-qq8bh\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:10 crc kubenswrapper[4860]: I0320 11:00:10.247841 4860 status_manager.go:851] "Failed to get status for pod" podUID="f0e14a08-824b-450f-bf98-2a476da0d44b" pod="openshift-marketplace/community-operators-r7ckk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-r7ckk\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:10 crc kubenswrapper[4860]: I0320 11:00:10.366478 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 11:00:10 crc kubenswrapper[4860]: I0320 11:00:10.412421 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 11:00:10 crc kubenswrapper[4860]: I0320 11:00:10.413579 4860 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:10 crc kubenswrapper[4860]: I0320 11:00:10.414119 4860 status_manager.go:851] "Failed to get status for pod" podUID="3587f3ba-577b-425a-adf5-336a8977dcc5" pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-srz5x\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:10 crc kubenswrapper[4860]: I0320 11:00:10.414759 4860 status_manager.go:851] "Failed to get status for pod" podUID="514f05c3-1404-46c6-9f4d-68437ea8ee0b" pod="openshift-marketplace/redhat-operators-qq8bh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-qq8bh\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:10 crc kubenswrapper[4860]: I0320 11:00:10.415288 4860 status_manager.go:851] "Failed to get status for pod" podUID="f0e14a08-824b-450f-bf98-2a476da0d44b" pod="openshift-marketplace/community-operators-r7ckk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-r7ckk\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:10 crc kubenswrapper[4860]: I0320 11:00:10.415651 4860 status_manager.go:851] "Failed to get status for pod" podUID="2268b7ae-c1db-4ef4-8236-60f7cfa277a1" pod="openshift-marketplace/redhat-marketplace-5jpww" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5jpww\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:10 crc kubenswrapper[4860]: I0320 11:00:10.415929 4860 status_manager.go:851] "Failed to get status for pod" podUID="f81a43aa-2c39-4d49-8526-f097322dd7bf" pod="openshift-marketplace/redhat-operators-jx27x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jx27x\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:10 crc kubenswrapper[4860]: I0320 11:00:10.416363 4860 status_manager.go:851] "Failed to get status for pod" podUID="f3e81cb0-739d-41be-b9fb-53a1c3de6a3d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:10 crc kubenswrapper[4860]: I0320 11:00:10.441064 4860 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="26618d38-6c86-4f4d-84d0-33bd5a64ca4a" Mar 20 11:00:10 crc kubenswrapper[4860]: I0320 11:00:10.441089 4860 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="26618d38-6c86-4f4d-84d0-33bd5a64ca4a" Mar 20 11:00:10 crc kubenswrapper[4860]: E0320 11:00:10.441418 4860 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 11:00:10 crc kubenswrapper[4860]: I0320 11:00:10.441835 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 11:00:10 crc kubenswrapper[4860]: W0320 11:00:10.459634 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-2ced920ad0ddef898531e2af9377d7dd911b48390d6b20b59b6c428c03cff84f WatchSource:0}: Error finding container 2ced920ad0ddef898531e2af9377d7dd911b48390d6b20b59b6c428c03cff84f: Status 404 returned error can't find the container with id 2ced920ad0ddef898531e2af9377d7dd911b48390d6b20b59b6c428c03cff84f Mar 20 11:00:11 crc kubenswrapper[4860]: I0320 11:00:11.256353 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 20 11:00:11 crc kubenswrapper[4860]: I0320 11:00:11.259094 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 20 11:00:11 crc kubenswrapper[4860]: I0320 11:00:11.259212 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ee792746dcee736a8a92bd7604013dd0c65310e6c461c9c93a8ab79ab26208fd"} Mar 20 11:00:11 crc kubenswrapper[4860]: I0320 11:00:11.260793 4860 status_manager.go:851] "Failed to get status for pod" podUID="514f05c3-1404-46c6-9f4d-68437ea8ee0b" pod="openshift-marketplace/redhat-operators-qq8bh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-qq8bh\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:11 crc kubenswrapper[4860]: I0320 11:00:11.261073 4860 status_manager.go:851] "Failed to get status for pod" 
podUID="f0e14a08-824b-450f-bf98-2a476da0d44b" pod="openshift-marketplace/community-operators-r7ckk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-r7ckk\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:11 crc kubenswrapper[4860]: I0320 11:00:11.261569 4860 status_manager.go:851] "Failed to get status for pod" podUID="2268b7ae-c1db-4ef4-8236-60f7cfa277a1" pod="openshift-marketplace/redhat-marketplace-5jpww" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5jpww\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:11 crc kubenswrapper[4860]: I0320 11:00:11.261875 4860 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="783124f64419143d0101e576dce78700db95db48469864c2940162caae521b15" exitCode=0 Mar 20 11:00:11 crc kubenswrapper[4860]: I0320 11:00:11.261907 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"783124f64419143d0101e576dce78700db95db48469864c2940162caae521b15"} Mar 20 11:00:11 crc kubenswrapper[4860]: I0320 11:00:11.261925 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"2ced920ad0ddef898531e2af9377d7dd911b48390d6b20b59b6c428c03cff84f"} Mar 20 11:00:11 crc kubenswrapper[4860]: I0320 11:00:11.262091 4860 status_manager.go:851] "Failed to get status for pod" podUID="f81a43aa-2c39-4d49-8526-f097322dd7bf" pod="openshift-marketplace/redhat-operators-jx27x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jx27x\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:11 crc kubenswrapper[4860]: I0320 
11:00:11.262198 4860 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="26618d38-6c86-4f4d-84d0-33bd5a64ca4a" Mar 20 11:00:11 crc kubenswrapper[4860]: I0320 11:00:11.262218 4860 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="26618d38-6c86-4f4d-84d0-33bd5a64ca4a" Mar 20 11:00:11 crc kubenswrapper[4860]: I0320 11:00:11.262428 4860 status_manager.go:851] "Failed to get status for pod" podUID="f3e81cb0-739d-41be-b9fb-53a1c3de6a3d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:11 crc kubenswrapper[4860]: E0320 11:00:11.262509 4860 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 11:00:11 crc kubenswrapper[4860]: I0320 11:00:11.262886 4860 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:11 crc kubenswrapper[4860]: I0320 11:00:11.263469 4860 status_manager.go:851] "Failed to get status for pod" podUID="3587f3ba-577b-425a-adf5-336a8977dcc5" pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-srz5x\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:11 crc kubenswrapper[4860]: I0320 11:00:11.263881 
4860 status_manager.go:851] "Failed to get status for pod" podUID="f81a43aa-2c39-4d49-8526-f097322dd7bf" pod="openshift-marketplace/redhat-operators-jx27x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jx27x\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:11 crc kubenswrapper[4860]: I0320 11:00:11.264097 4860 status_manager.go:851] "Failed to get status for pod" podUID="f3e81cb0-739d-41be-b9fb-53a1c3de6a3d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:11 crc kubenswrapper[4860]: I0320 11:00:11.264306 4860 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:11 crc kubenswrapper[4860]: I0320 11:00:11.264552 4860 status_manager.go:851] "Failed to get status for pod" podUID="3587f3ba-577b-425a-adf5-336a8977dcc5" pod="openshift-authentication/oauth-openshift-558db77b4-srz5x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-srz5x\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:11 crc kubenswrapper[4860]: I0320 11:00:11.264856 4860 status_manager.go:851] "Failed to get status for pod" podUID="514f05c3-1404-46c6-9f4d-68437ea8ee0b" pod="openshift-marketplace/redhat-operators-qq8bh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-qq8bh\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:11 crc kubenswrapper[4860]: I0320 
11:00:11.265153 4860 status_manager.go:851] "Failed to get status for pod" podUID="f0e14a08-824b-450f-bf98-2a476da0d44b" pod="openshift-marketplace/community-operators-r7ckk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-r7ckk\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:11 crc kubenswrapper[4860]: I0320 11:00:11.265412 4860 status_manager.go:851] "Failed to get status for pod" podUID="2268b7ae-c1db-4ef4-8236-60f7cfa277a1" pod="openshift-marketplace/redhat-marketplace-5jpww" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5jpww\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 20 11:00:12 crc kubenswrapper[4860]: I0320 11:00:12.292824 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f7c77282f274e0123893235d280f7fbcdff5c0ff3e6ac1721888c21de20119ce"} Mar 20 11:00:12 crc kubenswrapper[4860]: I0320 11:00:12.293269 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"c8fb6b49c461b4613d59d87fe47cecaa238e1f4a8609211f65cccb46de6ca211"} Mar 20 11:00:12 crc kubenswrapper[4860]: I0320 11:00:12.293286 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"fb29e37c0c3ac7d25e540a15f333f21f7de994198712f6ad16c75dbd9d4a97dc"} Mar 20 11:00:13 crc kubenswrapper[4860]: I0320 11:00:13.303076 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"63cf051bc188fb176018b83ebc88fa9d85730fe6418fdf7fd49c1e07a97c91d6"} Mar 20 11:00:13 crc kubenswrapper[4860]: I0320 11:00:13.303530 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 11:00:13 crc kubenswrapper[4860]: I0320 11:00:13.303548 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"37e66cee46ff35dd813a124de744e22432e2a3b9bced1ce2fbc404db4daa6386"} Mar 20 11:00:13 crc kubenswrapper[4860]: I0320 11:00:13.304522 4860 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="26618d38-6c86-4f4d-84d0-33bd5a64ca4a" Mar 20 11:00:13 crc kubenswrapper[4860]: I0320 11:00:13.304662 4860 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="26618d38-6c86-4f4d-84d0-33bd5a64ca4a" Mar 20 11:00:13 crc kubenswrapper[4860]: I0320 11:00:13.438318 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 11:00:13 crc kubenswrapper[4860]: I0320 11:00:13.438608 4860 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 20 11:00:13 crc kubenswrapper[4860]: I0320 11:00:13.438665 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 
192.168.126.11:10257: connect: connection refused" Mar 20 11:00:14 crc kubenswrapper[4860]: I0320 11:00:14.442854 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 11:00:14 crc kubenswrapper[4860]: I0320 11:00:14.442907 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 11:00:14 crc kubenswrapper[4860]: I0320 11:00:14.442925 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 11:00:14 crc kubenswrapper[4860]: I0320 11:00:14.442966 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 11:00:15 crc kubenswrapper[4860]: I0320 11:00:15.443114 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 
11:00:15 crc kubenswrapper[4860]: I0320 11:00:15.443732 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 11:00:15 crc kubenswrapper[4860]: E0320 11:00:15.443259 4860 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: failed to sync secret cache: timed out waiting for the condition Mar 20 11:00:15 crc kubenswrapper[4860]: E0320 11:00:15.443803 4860 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: failed to sync configmap cache: timed out waiting for the condition Mar 20 11:00:15 crc kubenswrapper[4860]: E0320 11:00:15.443851 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 11:02:17.443817694 +0000 UTC m=+461.665178592 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : failed to sync secret cache: timed out waiting for the condition Mar 20 11:00:15 crc kubenswrapper[4860]: E0320 11:00:15.443291 4860 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 20 11:00:15 crc kubenswrapper[4860]: E0320 11:00:15.443670 4860 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 20 11:00:15 crc kubenswrapper[4860]: E0320 11:00:15.443940 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 11:02:17.443904256 +0000 UTC m=+461.665265194 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : failed to sync configmap cache: timed out waiting for the condition Mar 20 11:00:15 crc kubenswrapper[4860]: I0320 11:00:15.447330 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 20 11:00:15 crc kubenswrapper[4860]: I0320 11:00:15.449889 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 11:00:15 crc kubenswrapper[4860]: E0320 11:00:15.454435 4860 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: failed to sync configmap cache: timed out waiting for the condition Mar 20 11:00:15 crc kubenswrapper[4860]: E0320 11:00:15.454463 4860 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: failed to sync configmap cache: timed out waiting for the condition Mar 20 11:00:15 crc kubenswrapper[4860]: E0320 11:00:15.454524 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 11:02:17.454505175 +0000 UTC m=+461.675866073 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : failed to sync configmap cache: timed out waiting for the condition Mar 20 11:00:15 crc kubenswrapper[4860]: E0320 11:00:15.454547 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 11:02:17.454538566 +0000 UTC m=+461.675899674 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : failed to sync configmap cache: timed out waiting for the condition Mar 20 11:00:18 crc kubenswrapper[4860]: I0320 11:00:18.320606 4860 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 11:00:18 crc kubenswrapper[4860]: I0320 11:00:18.510707 4860 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="8d2e5ccd-eb92-4cfd-95cc-574044b7a8cd" Mar 20 11:00:19 crc kubenswrapper[4860]: I0320 11:00:19.342101 4860 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="26618d38-6c86-4f4d-84d0-33bd5a64ca4a" Mar 20 11:00:19 crc kubenswrapper[4860]: I0320 11:00:19.342153 4860 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="26618d38-6c86-4f4d-84d0-33bd5a64ca4a" Mar 20 11:00:19 crc kubenswrapper[4860]: I0320 11:00:19.346553 4860 
status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="8d2e5ccd-eb92-4cfd-95cc-574044b7a8cd" Mar 20 11:00:19 crc kubenswrapper[4860]: I0320 11:00:19.447190 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 20 11:00:19 crc kubenswrapper[4860]: I0320 11:00:19.448423 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 20 11:00:19 crc kubenswrapper[4860]: I0320 11:00:19.448788 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 20 11:00:20 crc kubenswrapper[4860]: I0320 11:00:20.366677 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 11:00:21 crc kubenswrapper[4860]: E0320 11:00:21.441887 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert nginx-conf], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 11:00:21 crc kubenswrapper[4860]: E0320 11:00:21.449318 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-s2dwl], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 11:00:22 crc kubenswrapper[4860]: E0320 11:00:22.449968 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-cqllr], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 11:00:23 crc kubenswrapper[4860]: I0320 11:00:23.439089 4860 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 20 11:00:23 crc kubenswrapper[4860]: I0320 11:00:23.439646 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Mar 20 11:00:27 crc kubenswrapper[4860]: I0320 11:00:27.692339 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 20 11:00:28 crc kubenswrapper[4860]: I0320 11:00:28.547721 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 20 11:00:28 crc kubenswrapper[4860]: I0320 11:00:28.564354 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 20 11:00:28 crc kubenswrapper[4860]: I0320 11:00:28.766909 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 20 11:00:28 crc kubenswrapper[4860]: I0320 11:00:28.859152 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 20 11:00:28 crc kubenswrapper[4860]: I0320 11:00:28.982062 4860 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"client-ca" Mar 20 11:00:29 crc kubenswrapper[4860]: I0320 11:00:29.136419 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 20 11:00:29 crc kubenswrapper[4860]: I0320 11:00:29.148171 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 20 11:00:29 crc kubenswrapper[4860]: I0320 11:00:29.227590 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 20 11:00:29 crc kubenswrapper[4860]: I0320 11:00:29.267518 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 20 11:00:29 crc kubenswrapper[4860]: I0320 11:00:29.280462 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 20 11:00:29 crc kubenswrapper[4860]: I0320 11:00:29.403641 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 20 11:00:29 crc kubenswrapper[4860]: I0320 11:00:29.899057 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 20 11:00:30 crc kubenswrapper[4860]: I0320 11:00:30.173450 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 20 11:00:30 crc kubenswrapper[4860]: I0320 11:00:30.327279 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 20 11:00:30 crc kubenswrapper[4860]: I0320 11:00:30.350393 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 20 11:00:30 crc kubenswrapper[4860]: I0320 11:00:30.428295 4860 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 20 11:00:30 crc kubenswrapper[4860]: I0320 11:00:30.576737 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 20 11:00:30 crc kubenswrapper[4860]: I0320 11:00:30.600917 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 20 11:00:30 crc kubenswrapper[4860]: I0320 11:00:30.646825 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 20 11:00:30 crc kubenswrapper[4860]: I0320 11:00:30.706445 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 20 11:00:30 crc kubenswrapper[4860]: I0320 11:00:30.764261 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 20 11:00:30 crc kubenswrapper[4860]: I0320 11:00:30.964185 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 20 11:00:30 crc kubenswrapper[4860]: I0320 11:00:30.997830 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 20 11:00:31 crc kubenswrapper[4860]: I0320 11:00:31.021800 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 20 11:00:31 crc kubenswrapper[4860]: I0320 11:00:31.043394 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 20 11:00:31 crc kubenswrapper[4860]: I0320 11:00:31.460579 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 20 11:00:31 crc kubenswrapper[4860]: I0320 11:00:31.476701 4860 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 20 11:00:31 crc kubenswrapper[4860]: I0320 11:00:31.521050 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 20 11:00:31 crc kubenswrapper[4860]: I0320 11:00:31.644210 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 20 11:00:31 crc kubenswrapper[4860]: I0320 11:00:31.773263 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 20 11:00:31 crc kubenswrapper[4860]: I0320 11:00:31.822647 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 20 11:00:31 crc kubenswrapper[4860]: I0320 11:00:31.882420 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 20 11:00:31 crc kubenswrapper[4860]: I0320 11:00:31.886429 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 20 11:00:31 crc kubenswrapper[4860]: I0320 11:00:31.934345 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 20 11:00:31 crc kubenswrapper[4860]: I0320 11:00:31.986423 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 20 11:00:32 crc kubenswrapper[4860]: I0320 11:00:32.066206 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 20 11:00:32 crc kubenswrapper[4860]: I0320 11:00:32.148495 4860 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-route-controller-manager"/"serving-cert" Mar 20 11:00:32 crc kubenswrapper[4860]: I0320 11:00:32.301030 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 20 11:00:32 crc kubenswrapper[4860]: I0320 11:00:32.333403 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 20 11:00:32 crc kubenswrapper[4860]: I0320 11:00:32.358737 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 20 11:00:32 crc kubenswrapper[4860]: I0320 11:00:32.446927 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 20 11:00:32 crc kubenswrapper[4860]: I0320 11:00:32.503947 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 20 11:00:32 crc kubenswrapper[4860]: I0320 11:00:32.519598 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 20 11:00:32 crc kubenswrapper[4860]: I0320 11:00:32.519819 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 20 11:00:32 crc kubenswrapper[4860]: I0320 11:00:32.525336 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 20 11:00:32 crc kubenswrapper[4860]: I0320 11:00:32.526428 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 20 11:00:32 crc kubenswrapper[4860]: I0320 11:00:32.577997 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 20 11:00:32 crc kubenswrapper[4860]: I0320 
11:00:32.595770 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 20 11:00:32 crc kubenswrapper[4860]: I0320 11:00:32.597101 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 20 11:00:32 crc kubenswrapper[4860]: I0320 11:00:32.668638 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 20 11:00:32 crc kubenswrapper[4860]: I0320 11:00:32.902112 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 11:00:32 crc kubenswrapper[4860]: I0320 11:00:32.904925 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 20 11:00:32 crc kubenswrapper[4860]: I0320 11:00:32.908749 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 20 11:00:32 crc kubenswrapper[4860]: I0320 11:00:32.997162 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 20 11:00:33 crc kubenswrapper[4860]: I0320 11:00:33.074906 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 20 11:00:33 crc kubenswrapper[4860]: I0320 11:00:33.215528 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 20 11:00:33 crc kubenswrapper[4860]: I0320 11:00:33.225170 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 11:00:33 crc kubenswrapper[4860]: I0320 11:00:33.277697 4860 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 20 11:00:33 crc kubenswrapper[4860]: I0320 11:00:33.438414 4860 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 20 11:00:33 crc kubenswrapper[4860]: I0320 11:00:33.438478 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Mar 20 11:00:33 crc kubenswrapper[4860]: I0320 11:00:33.438537 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 11:00:33 crc kubenswrapper[4860]: I0320 11:00:33.439119 4860 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"ee792746dcee736a8a92bd7604013dd0c65310e6c461c9c93a8ab79ab26208fd"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Mar 20 11:00:33 crc kubenswrapper[4860]: I0320 11:00:33.439259 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://ee792746dcee736a8a92bd7604013dd0c65310e6c461c9c93a8ab79ab26208fd" gracePeriod=30 Mar 20 11:00:33 crc kubenswrapper[4860]: I0320 11:00:33.460039 4860 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 20 11:00:33 crc kubenswrapper[4860]: I0320 11:00:33.568410 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 20 11:00:33 crc kubenswrapper[4860]: I0320 11:00:33.650336 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 20 11:00:33 crc kubenswrapper[4860]: I0320 11:00:33.658292 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 20 11:00:33 crc kubenswrapper[4860]: I0320 11:00:33.688895 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 20 11:00:33 crc kubenswrapper[4860]: I0320 11:00:33.709400 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 20 11:00:33 crc kubenswrapper[4860]: I0320 11:00:33.711946 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 20 11:00:33 crc kubenswrapper[4860]: I0320 11:00:33.719493 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 20 11:00:33 crc kubenswrapper[4860]: I0320 11:00:33.726069 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 20 11:00:33 crc kubenswrapper[4860]: I0320 11:00:33.751655 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 20 11:00:33 crc kubenswrapper[4860]: I0320 11:00:33.776098 4860 reflector.go:368] Caches populated for *v1.RuntimeClass from 
k8s.io/client-go/informers/factory.go:160 Mar 20 11:00:33 crc kubenswrapper[4860]: I0320 11:00:33.781719 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 20 11:00:33 crc kubenswrapper[4860]: I0320 11:00:33.790016 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 20 11:00:33 crc kubenswrapper[4860]: I0320 11:00:33.797095 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 20 11:00:33 crc kubenswrapper[4860]: I0320 11:00:33.799063 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 20 11:00:33 crc kubenswrapper[4860]: I0320 11:00:33.851569 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 20 11:00:33 crc kubenswrapper[4860]: I0320 11:00:33.857765 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 20 11:00:33 crc kubenswrapper[4860]: I0320 11:00:33.874870 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 20 11:00:33 crc kubenswrapper[4860]: I0320 11:00:33.912553 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 20 11:00:33 crc kubenswrapper[4860]: I0320 11:00:33.929862 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 11:00:33 crc kubenswrapper[4860]: I0320 11:00:33.943074 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 20 11:00:33 crc kubenswrapper[4860]: I0320 11:00:33.959190 4860 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 20 11:00:33 crc kubenswrapper[4860]: I0320 11:00:33.968315 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 20 11:00:33 crc kubenswrapper[4860]: I0320 11:00:33.981700 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 20 11:00:33 crc kubenswrapper[4860]: I0320 11:00:33.991535 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 20 11:00:34 crc kubenswrapper[4860]: I0320 11:00:34.036500 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 20 11:00:34 crc kubenswrapper[4860]: I0320 11:00:34.108886 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 20 11:00:34 crc kubenswrapper[4860]: I0320 11:00:34.401299 4860 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 20 11:00:34 crc kubenswrapper[4860]: I0320 11:00:34.412978 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 11:00:34 crc kubenswrapper[4860]: I0320 11:00:34.418799 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 20 11:00:34 crc kubenswrapper[4860]: I0320 11:00:34.419662 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 20 11:00:34 crc kubenswrapper[4860]: I0320 11:00:34.451612 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 20 11:00:34 crc kubenswrapper[4860]: I0320 11:00:34.549986 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 20 11:00:34 crc kubenswrapper[4860]: I0320 11:00:34.582325 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 20 11:00:34 crc kubenswrapper[4860]: I0320 11:00:34.610665 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 20 11:00:34 crc kubenswrapper[4860]: I0320 11:00:34.614961 4860 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 20 11:00:34 crc kubenswrapper[4860]: I0320 11:00:34.657160 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 20 11:00:34 crc kubenswrapper[4860]: I0320 11:00:34.712983 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 20 11:00:34 crc kubenswrapper[4860]: I0320 11:00:34.726525 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 20 
11:00:34 crc kubenswrapper[4860]: I0320 11:00:34.778685 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 20 11:00:34 crc kubenswrapper[4860]: I0320 11:00:34.945570 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 20 11:00:34 crc kubenswrapper[4860]: I0320 11:00:34.947878 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 20 11:00:35 crc kubenswrapper[4860]: I0320 11:00:35.038179 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 20 11:00:35 crc kubenswrapper[4860]: I0320 11:00:35.059774 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 20 11:00:35 crc kubenswrapper[4860]: I0320 11:00:35.102443 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 20 11:00:35 crc kubenswrapper[4860]: I0320 11:00:35.211584 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 20 11:00:35 crc kubenswrapper[4860]: I0320 11:00:35.215704 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 20 11:00:35 crc kubenswrapper[4860]: I0320 11:00:35.263956 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 20 11:00:35 crc kubenswrapper[4860]: I0320 11:00:35.334852 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 20 11:00:35 crc kubenswrapper[4860]: I0320 11:00:35.396269 4860 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 20 11:00:35 crc kubenswrapper[4860]: I0320 11:00:35.412888 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 11:00:35 crc kubenswrapper[4860]: I0320 11:00:35.538526 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 20 11:00:35 crc kubenswrapper[4860]: I0320 11:00:35.563436 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 20 11:00:35 crc kubenswrapper[4860]: I0320 11:00:35.666408 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 20 11:00:35 crc kubenswrapper[4860]: I0320 11:00:35.702012 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 20 11:00:35 crc kubenswrapper[4860]: I0320 11:00:35.831965 4860 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 20 11:00:35 crc kubenswrapper[4860]: I0320 11:00:35.840006 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-marketplace/redhat-marketplace-5jpww","openshift-authentication/oauth-openshift-558db77b4-srz5x"] Mar 20 11:00:35 crc kubenswrapper[4860]: I0320 11:00:35.840081 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 11:00:35 crc kubenswrapper[4860]: I0320 11:00:35.845068 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 11:00:35 crc kubenswrapper[4860]: I0320 11:00:35.848253 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 11:00:35 crc kubenswrapper[4860]: I0320 11:00:35.865829 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=17.865801612 podStartE2EDuration="17.865801612s" podCreationTimestamp="2026-03-20 11:00:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 11:00:35.861045397 +0000 UTC m=+360.082406305" watchObservedRunningTime="2026-03-20 11:00:35.865801612 +0000 UTC m=+360.087162510" Mar 20 11:00:35 crc kubenswrapper[4860]: I0320 11:00:35.918710 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 20 11:00:35 crc kubenswrapper[4860]: I0320 11:00:35.951483 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 11:00:35 crc kubenswrapper[4860]: I0320 11:00:35.957095 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 11:00:36 crc kubenswrapper[4860]: I0320 11:00:36.019124 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 20 11:00:36 crc kubenswrapper[4860]: I0320 11:00:36.036076 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 20 11:00:36 crc kubenswrapper[4860]: I0320 11:00:36.060056 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 11:00:36 crc kubenswrapper[4860]: I0320 11:00:36.084604 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 20 11:00:36 crc kubenswrapper[4860]: I0320 
11:00:36.142137 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 20 11:00:36 crc kubenswrapper[4860]: I0320 11:00:36.323505 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 20 11:00:36 crc kubenswrapper[4860]: I0320 11:00:36.461069 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 20 11:00:36 crc kubenswrapper[4860]: I0320 11:00:36.476942 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 20 11:00:36 crc kubenswrapper[4860]: I0320 11:00:36.523265 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 20 11:00:36 crc kubenswrapper[4860]: I0320 11:00:36.538519 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 20 11:00:36 crc kubenswrapper[4860]: I0320 11:00:36.615130 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 20 11:00:36 crc kubenswrapper[4860]: I0320 11:00:36.651595 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 20 11:00:36 crc kubenswrapper[4860]: I0320 11:00:36.750757 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 20 11:00:36 crc kubenswrapper[4860]: I0320 11:00:36.755304 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 20 11:00:36 crc kubenswrapper[4860]: I0320 11:00:36.836803 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 20 
11:00:36 crc kubenswrapper[4860]: I0320 11:00:36.853461 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 20 11:00:36 crc kubenswrapper[4860]: I0320 11:00:36.909817 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 20 11:00:36 crc kubenswrapper[4860]: I0320 11:00:36.922507 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 20 11:00:37 crc kubenswrapper[4860]: I0320 11:00:37.039555 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 20 11:00:37 crc kubenswrapper[4860]: I0320 11:00:37.057372 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 20 11:00:37 crc kubenswrapper[4860]: I0320 11:00:37.088527 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 20 11:00:37 crc kubenswrapper[4860]: I0320 11:00:37.096454 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 20 11:00:37 crc kubenswrapper[4860]: I0320 11:00:37.108329 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 20 11:00:37 crc kubenswrapper[4860]: I0320 11:00:37.128001 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 20 11:00:37 crc kubenswrapper[4860]: I0320 11:00:37.164330 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 11:00:37 crc kubenswrapper[4860]: I0320 11:00:37.204724 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 20 
11:00:37 crc kubenswrapper[4860]: I0320 11:00:37.248253 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 20 11:00:37 crc kubenswrapper[4860]: I0320 11:00:37.346679 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 20 11:00:37 crc kubenswrapper[4860]: I0320 11:00:37.391065 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 20 11:00:37 crc kubenswrapper[4860]: I0320 11:00:37.406141 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 20 11:00:37 crc kubenswrapper[4860]: I0320 11:00:37.413367 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 11:00:37 crc kubenswrapper[4860]: I0320 11:00:37.420904 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2268b7ae-c1db-4ef4-8236-60f7cfa277a1" path="/var/lib/kubelet/pods/2268b7ae-c1db-4ef4-8236-60f7cfa277a1/volumes" Mar 20 11:00:37 crc kubenswrapper[4860]: I0320 11:00:37.422258 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3587f3ba-577b-425a-adf5-336a8977dcc5" path="/var/lib/kubelet/pods/3587f3ba-577b-425a-adf5-336a8977dcc5/volumes" Mar 20 11:00:37 crc kubenswrapper[4860]: I0320 11:00:37.590560 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 20 11:00:37 crc kubenswrapper[4860]: I0320 11:00:37.636662 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 20 11:00:37 crc kubenswrapper[4860]: I0320 11:00:37.695071 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 20 11:00:37 crc 
kubenswrapper[4860]: I0320 11:00:37.760142 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 20 11:00:37 crc kubenswrapper[4860]: I0320 11:00:37.840055 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 20 11:00:37 crc kubenswrapper[4860]: I0320 11:00:37.959852 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 20 11:00:37 crc kubenswrapper[4860]: I0320 11:00:37.993571 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 20 11:00:38 crc kubenswrapper[4860]: I0320 11:00:38.076269 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 20 11:00:38 crc kubenswrapper[4860]: I0320 11:00:38.082967 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 11:00:38 crc kubenswrapper[4860]: I0320 11:00:38.096105 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 20 11:00:38 crc kubenswrapper[4860]: I0320 11:00:38.123523 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 20 11:00:38 crc kubenswrapper[4860]: I0320 11:00:38.152634 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 20 11:00:38 crc kubenswrapper[4860]: I0320 11:00:38.232272 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 20 11:00:38 crc kubenswrapper[4860]: I0320 11:00:38.288402 4860 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 20 11:00:38 crc kubenswrapper[4860]: I0320 11:00:38.300917 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 20 11:00:38 crc kubenswrapper[4860]: I0320 11:00:38.348862 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 20 11:00:38 crc kubenswrapper[4860]: I0320 11:00:38.618805 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 20 11:00:38 crc kubenswrapper[4860]: I0320 11:00:38.700347 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 20 11:00:38 crc kubenswrapper[4860]: I0320 11:00:38.720608 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 20 11:00:38 crc kubenswrapper[4860]: I0320 11:00:38.739110 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 20 11:00:38 crc kubenswrapper[4860]: I0320 11:00:38.741948 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 20 11:00:38 crc kubenswrapper[4860]: I0320 11:00:38.887761 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 20 11:00:38 crc kubenswrapper[4860]: I0320 11:00:38.924981 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 20 11:00:38 crc kubenswrapper[4860]: I0320 11:00:38.937780 4860 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 20 11:00:39 crc kubenswrapper[4860]: I0320 11:00:39.011488 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 20 11:00:39 crc kubenswrapper[4860]: I0320 11:00:39.102656 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 20 11:00:39 crc kubenswrapper[4860]: I0320 11:00:39.188366 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 20 11:00:39 crc kubenswrapper[4860]: I0320 11:00:39.280723 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 20 11:00:39 crc kubenswrapper[4860]: I0320 11:00:39.288137 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 20 11:00:39 crc kubenswrapper[4860]: I0320 11:00:39.337775 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 20 11:00:39 crc kubenswrapper[4860]: I0320 11:00:39.373088 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 20 11:00:39 crc kubenswrapper[4860]: I0320 11:00:39.381343 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 20 11:00:39 crc kubenswrapper[4860]: I0320 11:00:39.467455 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 20 11:00:39 crc kubenswrapper[4860]: I0320 11:00:39.494454 4860 reflector.go:368] Caches populated for *v1.CSIDriver from 
k8s.io/client-go/informers/factory.go:160 Mar 20 11:00:39 crc kubenswrapper[4860]: I0320 11:00:39.531481 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 20 11:00:39 crc kubenswrapper[4860]: I0320 11:00:39.549363 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 20 11:00:39 crc kubenswrapper[4860]: I0320 11:00:39.571598 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 20 11:00:39 crc kubenswrapper[4860]: I0320 11:00:39.608334 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 20 11:00:39 crc kubenswrapper[4860]: I0320 11:00:39.665521 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 20 11:00:39 crc kubenswrapper[4860]: I0320 11:00:39.766355 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 20 11:00:39 crc kubenswrapper[4860]: I0320 11:00:39.771008 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 20 11:00:39 crc kubenswrapper[4860]: I0320 11:00:39.783899 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 20 11:00:39 crc kubenswrapper[4860]: I0320 11:00:39.899439 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 20 11:00:39 crc kubenswrapper[4860]: I0320 11:00:39.905167 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 20 11:00:40 crc kubenswrapper[4860]: I0320 11:00:40.095474 4860 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 20 11:00:40 crc kubenswrapper[4860]: I0320 11:00:40.103858 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 20 11:00:40 crc kubenswrapper[4860]: I0320 11:00:40.138952 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 20 11:00:40 crc kubenswrapper[4860]: I0320 11:00:40.145907 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 20 11:00:40 crc kubenswrapper[4860]: I0320 11:00:40.164770 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 20 11:00:40 crc kubenswrapper[4860]: I0320 11:00:40.166770 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 20 11:00:40 crc kubenswrapper[4860]: I0320 11:00:40.196600 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 20 11:00:40 crc kubenswrapper[4860]: I0320 11:00:40.442511 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 11:00:40 crc kubenswrapper[4860]: I0320 11:00:40.733547 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 20 11:00:40 crc kubenswrapper[4860]: I0320 11:00:40.784679 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 20 11:00:40 crc kubenswrapper[4860]: I0320 11:00:40.818086 4860 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 11:00:40 crc kubenswrapper[4860]: I0320 11:00:40.818413 
4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://6f9fcf8849363a86240c3e522164d76994e316f4d215b2b524d3e54d9f3d5cbb" gracePeriod=5 Mar 20 11:00:40 crc kubenswrapper[4860]: I0320 11:00:40.826894 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 20 11:00:40 crc kubenswrapper[4860]: I0320 11:00:40.980800 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 20 11:00:41 crc kubenswrapper[4860]: I0320 11:00:41.025495 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 20 11:00:41 crc kubenswrapper[4860]: I0320 11:00:41.054215 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 20 11:00:41 crc kubenswrapper[4860]: I0320 11:00:41.090160 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 20 11:00:41 crc kubenswrapper[4860]: I0320 11:00:41.128773 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 20 11:00:41 crc kubenswrapper[4860]: I0320 11:00:41.221098 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 20 11:00:41 crc kubenswrapper[4860]: I0320 11:00:41.300024 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 20 11:00:41 crc kubenswrapper[4860]: I0320 11:00:41.355940 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 20 11:00:41 crc 
kubenswrapper[4860]: I0320 11:00:41.376511 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 20 11:00:41 crc kubenswrapper[4860]: I0320 11:00:41.409434 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 20 11:00:41 crc kubenswrapper[4860]: I0320 11:00:41.537116 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 20 11:00:41 crc kubenswrapper[4860]: I0320 11:00:41.665879 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 20 11:00:41 crc kubenswrapper[4860]: I0320 11:00:41.672419 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 20 11:00:41 crc kubenswrapper[4860]: I0320 11:00:41.721019 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 20 11:00:41 crc kubenswrapper[4860]: I0320 11:00:41.769375 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 20 11:00:41 crc kubenswrapper[4860]: I0320 11:00:41.934454 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 20 11:00:42 crc kubenswrapper[4860]: I0320 11:00:42.167253 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 20 11:00:42 crc kubenswrapper[4860]: I0320 11:00:42.353958 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 20 11:00:42 crc kubenswrapper[4860]: I0320 11:00:42.485866 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 
20 11:00:42 crc kubenswrapper[4860]: I0320 11:00:42.603150 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 20 11:00:42 crc kubenswrapper[4860]: I0320 11:00:42.720253 4860 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 20 11:00:42 crc kubenswrapper[4860]: I0320 11:00:42.767153 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 20 11:00:42 crc kubenswrapper[4860]: I0320 11:00:42.816080 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 20 11:00:42 crc kubenswrapper[4860]: I0320 11:00:42.857706 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 20 11:00:42 crc kubenswrapper[4860]: I0320 11:00:42.926550 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 20 11:00:42 crc kubenswrapper[4860]: I0320 11:00:42.960760 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 20 11:00:43 crc kubenswrapper[4860]: I0320 11:00:43.014793 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 20 11:00:43 crc kubenswrapper[4860]: I0320 11:00:43.132025 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 20 11:00:43 crc kubenswrapper[4860]: I0320 11:00:43.233938 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 20 11:00:43 crc kubenswrapper[4860]: I0320 11:00:43.665287 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 11:00:44 crc 
kubenswrapper[4860]: I0320 11:00:44.493654 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts"] Mar 20 11:00:44 crc kubenswrapper[4860]: E0320 11:00:44.493898 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.493912 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 20 11:00:44 crc kubenswrapper[4860]: E0320 11:00:44.493923 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2268b7ae-c1db-4ef4-8236-60f7cfa277a1" containerName="extract-utilities" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.493930 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="2268b7ae-c1db-4ef4-8236-60f7cfa277a1" containerName="extract-utilities" Mar 20 11:00:44 crc kubenswrapper[4860]: E0320 11:00:44.493949 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3e81cb0-739d-41be-b9fb-53a1c3de6a3d" containerName="installer" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.493956 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3e81cb0-739d-41be-b9fb-53a1c3de6a3d" containerName="installer" Mar 20 11:00:44 crc kubenswrapper[4860]: E0320 11:00:44.493965 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2268b7ae-c1db-4ef4-8236-60f7cfa277a1" containerName="extract-content" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.493970 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="2268b7ae-c1db-4ef4-8236-60f7cfa277a1" containerName="extract-content" Mar 20 11:00:44 crc kubenswrapper[4860]: E0320 11:00:44.493979 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2268b7ae-c1db-4ef4-8236-60f7cfa277a1" containerName="registry-server" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.493985 
4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="2268b7ae-c1db-4ef4-8236-60f7cfa277a1" containerName="registry-server" Mar 20 11:00:44 crc kubenswrapper[4860]: E0320 11:00:44.493993 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3587f3ba-577b-425a-adf5-336a8977dcc5" containerName="oauth-openshift" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.494000 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="3587f3ba-577b-425a-adf5-336a8977dcc5" containerName="oauth-openshift" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.494088 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="3587f3ba-577b-425a-adf5-336a8977dcc5" containerName="oauth-openshift" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.494099 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="2268b7ae-c1db-4ef4-8236-60f7cfa277a1" containerName="registry-server" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.494112 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3e81cb0-739d-41be-b9fb-53a1c3de6a3d" containerName="installer" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.494118 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.494536 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.496719 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.496941 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.497770 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.498646 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.498829 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.498986 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.499130 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.503211 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.505309 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.511171 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts"] Mar 
20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.511347 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.511539 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.511659 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.514277 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0f06028e-1b3c-4890-857c-4f45971b09e2-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.514348 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0f06028e-1b3c-4890-857c-4f45971b09e2-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.514377 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0f06028e-1b3c-4890-857c-4f45971b09e2-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc 
kubenswrapper[4860]: I0320 11:00:44.514404 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5bgm\" (UniqueName: \"kubernetes.io/projected/0f06028e-1b3c-4890-857c-4f45971b09e2-kube-api-access-k5bgm\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.514433 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0f06028e-1b3c-4890-857c-4f45971b09e2-v4-0-config-system-router-certs\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.514464 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0f06028e-1b3c-4890-857c-4f45971b09e2-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.514490 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0f06028e-1b3c-4890-857c-4f45971b09e2-audit-dir\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.514528 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/0f06028e-1b3c-4890-857c-4f45971b09e2-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.514568 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0f06028e-1b3c-4890-857c-4f45971b09e2-v4-0-config-user-template-login\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.514616 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0f06028e-1b3c-4890-857c-4f45971b09e2-audit-policies\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.514646 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0f06028e-1b3c-4890-857c-4f45971b09e2-v4-0-config-user-template-error\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.514725 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0f06028e-1b3c-4890-857c-4f45971b09e2-v4-0-config-system-service-ca\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: 
\"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.514763 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0f06028e-1b3c-4890-857c-4f45971b09e2-v4-0-config-system-session\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.514796 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0f06028e-1b3c-4890-857c-4f45971b09e2-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.517651 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.518838 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.533147 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.616304 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0f06028e-1b3c-4890-857c-4f45971b09e2-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " 
pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.616367 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0f06028e-1b3c-4890-857c-4f45971b09e2-audit-dir\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.616412 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0f06028e-1b3c-4890-857c-4f45971b09e2-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.616439 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0f06028e-1b3c-4890-857c-4f45971b09e2-v4-0-config-user-template-login\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.616535 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0f06028e-1b3c-4890-857c-4f45971b09e2-audit-dir\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.617730 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/0f06028e-1b3c-4890-857c-4f45971b09e2-audit-policies\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.617754 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0f06028e-1b3c-4890-857c-4f45971b09e2-audit-policies\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.617942 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0f06028e-1b3c-4890-857c-4f45971b09e2-v4-0-config-user-template-error\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.618057 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0f06028e-1b3c-4890-857c-4f45971b09e2-v4-0-config-system-service-ca\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.618096 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0f06028e-1b3c-4890-857c-4f45971b09e2-v4-0-config-system-session\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc 
kubenswrapper[4860]: I0320 11:00:44.618123 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0f06028e-1b3c-4890-857c-4f45971b09e2-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.618181 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0f06028e-1b3c-4890-857c-4f45971b09e2-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.632476 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0f06028e-1b3c-4890-857c-4f45971b09e2-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.632542 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0f06028e-1b3c-4890-857c-4f45971b09e2-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.632589 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5bgm\" (UniqueName: 
\"kubernetes.io/projected/0f06028e-1b3c-4890-857c-4f45971b09e2-kube-api-access-k5bgm\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.632652 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0f06028e-1b3c-4890-857c-4f45971b09e2-v4-0-config-system-router-certs\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.633298 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0f06028e-1b3c-4890-857c-4f45971b09e2-v4-0-config-user-template-login\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.635966 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0f06028e-1b3c-4890-857c-4f45971b09e2-v4-0-config-user-template-error\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.636260 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0f06028e-1b3c-4890-857c-4f45971b09e2-v4-0-config-system-session\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 
crc kubenswrapper[4860]: I0320 11:00:44.636707 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0f06028e-1b3c-4890-857c-4f45971b09e2-v4-0-config-system-service-ca\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.636749 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0f06028e-1b3c-4890-857c-4f45971b09e2-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.637899 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0f06028e-1b3c-4890-857c-4f45971b09e2-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.641445 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0f06028e-1b3c-4890-857c-4f45971b09e2-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.641545 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/0f06028e-1b3c-4890-857c-4f45971b09e2-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.641974 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0f06028e-1b3c-4890-857c-4f45971b09e2-v4-0-config-system-router-certs\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.642902 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0f06028e-1b3c-4890-857c-4f45971b09e2-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.643104 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0f06028e-1b3c-4890-857c-4f45971b09e2-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.659834 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5bgm\" (UniqueName: \"kubernetes.io/projected/0f06028e-1b3c-4890-857c-4f45971b09e2-kube-api-access-k5bgm\") pod \"oauth-openshift-6f8f59f8d9-5xxts\" (UID: \"0f06028e-1b3c-4890-857c-4f45971b09e2\") " pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc 
kubenswrapper[4860]: I0320 11:00:44.823184 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:44 crc kubenswrapper[4860]: I0320 11:00:44.932238 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 11:00:45 crc kubenswrapper[4860]: I0320 11:00:45.153261 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 20 11:00:45 crc kubenswrapper[4860]: I0320 11:00:45.228619 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts"] Mar 20 11:00:45 crc kubenswrapper[4860]: I0320 11:00:45.499663 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" event={"ID":"0f06028e-1b3c-4890-857c-4f45971b09e2","Type":"ContainerStarted","Data":"c4b23f985c72e603416bf3ef53615b0f51cd9bb142bf92c48b86f3e167cc39e6"} Mar 20 11:00:45 crc kubenswrapper[4860]: I0320 11:00:45.500957 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:45 crc kubenswrapper[4860]: I0320 11:00:45.500979 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" event={"ID":"0f06028e-1b3c-4890-857c-4f45971b09e2","Type":"ContainerStarted","Data":"1f2bbffe8baa0b74a42d4a12f23036f4ae087c8ae5f3f27235e6d19fe322ae6b"} Mar 20 11:00:45 crc kubenswrapper[4860]: I0320 11:00:45.502487 4860 patch_prober.go:28] interesting pod/oauth-openshift-6f8f59f8d9-5xxts container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.67:6443/healthz\": dial tcp 10.217.0.67:6443: connect: connection refused" start-of-body= Mar 20 11:00:45 crc kubenswrapper[4860]: I0320 11:00:45.502534 
4860 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" podUID="0f06028e-1b3c-4890-857c-4f45971b09e2" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.67:6443/healthz\": dial tcp 10.217.0.67:6443: connect: connection refused" Mar 20 11:00:45 crc kubenswrapper[4860]: I0320 11:00:45.517309 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" podStartSLOduration=74.517284842 podStartE2EDuration="1m14.517284842s" podCreationTimestamp="2026-03-20 10:59:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 11:00:45.515746738 +0000 UTC m=+369.737107636" watchObservedRunningTime="2026-03-20 11:00:45.517284842 +0000 UTC m=+369.738645740" Mar 20 11:00:46 crc kubenswrapper[4860]: I0320 11:00:46.398781 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 20 11:00:46 crc kubenswrapper[4860]: I0320 11:00:46.398904 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 11:00:46 crc kubenswrapper[4860]: I0320 11:00:46.460539 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 11:00:46 crc kubenswrapper[4860]: I0320 11:00:46.460621 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 11:00:46 crc kubenswrapper[4860]: I0320 11:00:46.460793 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 11:00:46 crc kubenswrapper[4860]: I0320 11:00:46.460839 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 11:00:46 crc kubenswrapper[4860]: I0320 11:00:46.460925 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 11:00:46 crc kubenswrapper[4860]: I0320 11:00:46.460918 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: 
"resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:00:46 crc kubenswrapper[4860]: I0320 11:00:46.460825 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:00:46 crc kubenswrapper[4860]: I0320 11:00:46.461046 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:00:46 crc kubenswrapper[4860]: I0320 11:00:46.461142 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:00:46 crc kubenswrapper[4860]: I0320 11:00:46.461687 4860 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:46 crc kubenswrapper[4860]: I0320 11:00:46.461713 4860 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:46 crc kubenswrapper[4860]: I0320 11:00:46.461732 4860 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:46 crc kubenswrapper[4860]: I0320 11:00:46.461748 4860 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:46 crc kubenswrapper[4860]: I0320 11:00:46.472319 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:00:46 crc kubenswrapper[4860]: I0320 11:00:46.509111 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 20 11:00:46 crc kubenswrapper[4860]: I0320 11:00:46.510461 4860 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="6f9fcf8849363a86240c3e522164d76994e316f4d215b2b524d3e54d9f3d5cbb" exitCode=137 Mar 20 11:00:46 crc kubenswrapper[4860]: I0320 11:00:46.510538 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 11:00:46 crc kubenswrapper[4860]: I0320 11:00:46.510567 4860 scope.go:117] "RemoveContainer" containerID="6f9fcf8849363a86240c3e522164d76994e316f4d215b2b524d3e54d9f3d5cbb" Mar 20 11:00:46 crc kubenswrapper[4860]: I0320 11:00:46.517440 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-6f8f59f8d9-5xxts" Mar 20 11:00:46 crc kubenswrapper[4860]: I0320 11:00:46.540378 4860 scope.go:117] "RemoveContainer" containerID="6f9fcf8849363a86240c3e522164d76994e316f4d215b2b524d3e54d9f3d5cbb" Mar 20 11:00:46 crc kubenswrapper[4860]: E0320 11:00:46.541635 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f9fcf8849363a86240c3e522164d76994e316f4d215b2b524d3e54d9f3d5cbb\": container with ID starting with 6f9fcf8849363a86240c3e522164d76994e316f4d215b2b524d3e54d9f3d5cbb not found: ID does not exist" containerID="6f9fcf8849363a86240c3e522164d76994e316f4d215b2b524d3e54d9f3d5cbb" Mar 20 11:00:46 crc kubenswrapper[4860]: I0320 11:00:46.541677 4860 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6f9fcf8849363a86240c3e522164d76994e316f4d215b2b524d3e54d9f3d5cbb"} err="failed to get container status \"6f9fcf8849363a86240c3e522164d76994e316f4d215b2b524d3e54d9f3d5cbb\": rpc error: code = NotFound desc = could not find container \"6f9fcf8849363a86240c3e522164d76994e316f4d215b2b524d3e54d9f3d5cbb\": container with ID starting with 6f9fcf8849363a86240c3e522164d76994e316f4d215b2b524d3e54d9f3d5cbb not found: ID does not exist" Mar 20 11:00:46 crc kubenswrapper[4860]: I0320 11:00:46.564049 4860 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:47 crc kubenswrapper[4860]: I0320 11:00:47.421858 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 20 11:01:01 crc kubenswrapper[4860]: I0320 11:01:01.608601 4860 generic.go:334] "Generic (PLEG): container finished" podID="403ca5f6-bd52-40de-88d6-5151b3202c76" containerID="857625adfd41b4c07f7a8c47d596e9428a9381932a0175f6ddfcb8f7de0229a9" exitCode=0 Mar 20 11:01:01 crc kubenswrapper[4860]: I0320 11:01:01.608706 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zhgh4" event={"ID":"403ca5f6-bd52-40de-88d6-5151b3202c76","Type":"ContainerDied","Data":"857625adfd41b4c07f7a8c47d596e9428a9381932a0175f6ddfcb8f7de0229a9"} Mar 20 11:01:01 crc kubenswrapper[4860]: I0320 11:01:01.609822 4860 scope.go:117] "RemoveContainer" containerID="857625adfd41b4c07f7a8c47d596e9428a9381932a0175f6ddfcb8f7de0229a9" Mar 20 11:01:02 crc kubenswrapper[4860]: I0320 11:01:02.615947 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zhgh4" 
event={"ID":"403ca5f6-bd52-40de-88d6-5151b3202c76","Type":"ContainerStarted","Data":"71b2cb77686050f112cbf04ff04d1685bf9b88c82f8a98a4da293848bc821143"} Mar 20 11:01:02 crc kubenswrapper[4860]: I0320 11:01:02.616754 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-zhgh4" Mar 20 11:01:02 crc kubenswrapper[4860]: I0320 11:01:02.618559 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-zhgh4" Mar 20 11:01:03 crc kubenswrapper[4860]: I0320 11:01:03.623762 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Mar 20 11:01:03 crc kubenswrapper[4860]: I0320 11:01:03.624661 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 20 11:01:03 crc kubenswrapper[4860]: I0320 11:01:03.626201 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 20 11:01:03 crc kubenswrapper[4860]: I0320 11:01:03.626270 4860 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="ee792746dcee736a8a92bd7604013dd0c65310e6c461c9c93a8ab79ab26208fd" exitCode=137 Mar 20 11:01:03 crc kubenswrapper[4860]: I0320 11:01:03.626394 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"ee792746dcee736a8a92bd7604013dd0c65310e6c461c9c93a8ab79ab26208fd"} Mar 20 11:01:03 crc kubenswrapper[4860]: I0320 11:01:03.626499 4860 scope.go:117] "RemoveContainer" 
containerID="6bf4a38879e8e3c687bd3c57a3c68a29ad9a9e609ea0cbd220493b6ee4e7d9a3" Mar 20 11:01:04 crc kubenswrapper[4860]: I0320 11:01:04.635866 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Mar 20 11:01:04 crc kubenswrapper[4860]: I0320 11:01:04.637265 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 20 11:01:04 crc kubenswrapper[4860]: I0320 11:01:04.638434 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c10a79b01ed3c1a1c40f224371e5b1990c85d54c1c4aec9018ecd4245f9d69fc"} Mar 20 11:01:10 crc kubenswrapper[4860]: I0320 11:01:10.366665 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 11:01:13 crc kubenswrapper[4860]: I0320 11:01:13.438007 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 11:01:13 crc kubenswrapper[4860]: I0320 11:01:13.444488 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 11:01:20 crc kubenswrapper[4860]: I0320 11:01:20.370515 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 11:01:21 crc kubenswrapper[4860]: I0320 11:01:21.178350 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566740-627pq"] Mar 20 11:01:21 crc kubenswrapper[4860]: I0320 11:01:21.179748 4860 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-627pq" Mar 20 11:01:21 crc kubenswrapper[4860]: I0320 11:01:21.182493 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 11:01:21 crc kubenswrapper[4860]: I0320 11:01:21.182751 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 11:01:21 crc kubenswrapper[4860]: I0320 11:01:21.193771 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566740-26bw9"] Mar 20 11:01:21 crc kubenswrapper[4860]: I0320 11:01:21.215698 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566740-26bw9" Mar 20 11:01:21 crc kubenswrapper[4860]: I0320 11:01:21.234788 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:01:21 crc kubenswrapper[4860]: I0320 11:01:21.237087 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-82p2f" Mar 20 11:01:21 crc kubenswrapper[4860]: I0320 11:01:21.241708 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:01:21 crc kubenswrapper[4860]: I0320 11:01:21.244545 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566740-627pq"] Mar 20 11:01:21 crc kubenswrapper[4860]: I0320 11:01:21.255475 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566740-26bw9"] Mar 20 11:01:21 crc kubenswrapper[4860]: I0320 11:01:21.309764 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sv5sr\" 
(UniqueName: \"kubernetes.io/projected/41a09ead-8137-4791-896c-c5a9cad7f4cf-kube-api-access-sv5sr\") pod \"collect-profiles-29566740-627pq\" (UID: \"41a09ead-8137-4791-896c-c5a9cad7f4cf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-627pq" Mar 20 11:01:21 crc kubenswrapper[4860]: I0320 11:01:21.309860 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/41a09ead-8137-4791-896c-c5a9cad7f4cf-secret-volume\") pod \"collect-profiles-29566740-627pq\" (UID: \"41a09ead-8137-4791-896c-c5a9cad7f4cf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-627pq" Mar 20 11:01:21 crc kubenswrapper[4860]: I0320 11:01:21.309919 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/41a09ead-8137-4791-896c-c5a9cad7f4cf-config-volume\") pod \"collect-profiles-29566740-627pq\" (UID: \"41a09ead-8137-4791-896c-c5a9cad7f4cf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-627pq" Mar 20 11:01:21 crc kubenswrapper[4860]: I0320 11:01:21.309947 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlzmr\" (UniqueName: \"kubernetes.io/projected/b31d1240-ea69-4da9-9a40-70f252222d4d-kube-api-access-mlzmr\") pod \"auto-csr-approver-29566740-26bw9\" (UID: \"b31d1240-ea69-4da9-9a40-70f252222d4d\") " pod="openshift-infra/auto-csr-approver-29566740-26bw9" Mar 20 11:01:21 crc kubenswrapper[4860]: I0320 11:01:21.411083 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sv5sr\" (UniqueName: \"kubernetes.io/projected/41a09ead-8137-4791-896c-c5a9cad7f4cf-kube-api-access-sv5sr\") pod \"collect-profiles-29566740-627pq\" (UID: \"41a09ead-8137-4791-896c-c5a9cad7f4cf\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-627pq" Mar 20 11:01:21 crc kubenswrapper[4860]: I0320 11:01:21.411144 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/41a09ead-8137-4791-896c-c5a9cad7f4cf-secret-volume\") pod \"collect-profiles-29566740-627pq\" (UID: \"41a09ead-8137-4791-896c-c5a9cad7f4cf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-627pq" Mar 20 11:01:21 crc kubenswrapper[4860]: I0320 11:01:21.411173 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/41a09ead-8137-4791-896c-c5a9cad7f4cf-config-volume\") pod \"collect-profiles-29566740-627pq\" (UID: \"41a09ead-8137-4791-896c-c5a9cad7f4cf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-627pq" Mar 20 11:01:21 crc kubenswrapper[4860]: I0320 11:01:21.411207 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlzmr\" (UniqueName: \"kubernetes.io/projected/b31d1240-ea69-4da9-9a40-70f252222d4d-kube-api-access-mlzmr\") pod \"auto-csr-approver-29566740-26bw9\" (UID: \"b31d1240-ea69-4da9-9a40-70f252222d4d\") " pod="openshift-infra/auto-csr-approver-29566740-26bw9" Mar 20 11:01:21 crc kubenswrapper[4860]: I0320 11:01:21.412578 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/41a09ead-8137-4791-896c-c5a9cad7f4cf-config-volume\") pod \"collect-profiles-29566740-627pq\" (UID: \"41a09ead-8137-4791-896c-c5a9cad7f4cf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-627pq" Mar 20 11:01:21 crc kubenswrapper[4860]: I0320 11:01:21.422931 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/41a09ead-8137-4791-896c-c5a9cad7f4cf-secret-volume\") pod 
\"collect-profiles-29566740-627pq\" (UID: \"41a09ead-8137-4791-896c-c5a9cad7f4cf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-627pq" Mar 20 11:01:21 crc kubenswrapper[4860]: I0320 11:01:21.434986 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sv5sr\" (UniqueName: \"kubernetes.io/projected/41a09ead-8137-4791-896c-c5a9cad7f4cf-kube-api-access-sv5sr\") pod \"collect-profiles-29566740-627pq\" (UID: \"41a09ead-8137-4791-896c-c5a9cad7f4cf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-627pq" Mar 20 11:01:21 crc kubenswrapper[4860]: I0320 11:01:21.447573 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlzmr\" (UniqueName: \"kubernetes.io/projected/b31d1240-ea69-4da9-9a40-70f252222d4d-kube-api-access-mlzmr\") pod \"auto-csr-approver-29566740-26bw9\" (UID: \"b31d1240-ea69-4da9-9a40-70f252222d4d\") " pod="openshift-infra/auto-csr-approver-29566740-26bw9" Mar 20 11:01:21 crc kubenswrapper[4860]: I0320 11:01:21.544043 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-627pq" Mar 20 11:01:21 crc kubenswrapper[4860]: I0320 11:01:21.560510 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566740-26bw9" Mar 20 11:01:21 crc kubenswrapper[4860]: I0320 11:01:21.966652 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566740-26bw9"] Mar 20 11:01:22 crc kubenswrapper[4860]: I0320 11:01:22.012606 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566740-627pq"] Mar 20 11:01:22 crc kubenswrapper[4860]: W0320 11:01:22.017154 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41a09ead_8137_4791_896c_c5a9cad7f4cf.slice/crio-95cd01e08e4aa9d5bc8aab08fcfec2f0373c59ae07e50c96cd2b03534805f951 WatchSource:0}: Error finding container 95cd01e08e4aa9d5bc8aab08fcfec2f0373c59ae07e50c96cd2b03534805f951: Status 404 returned error can't find the container with id 95cd01e08e4aa9d5bc8aab08fcfec2f0373c59ae07e50c96cd2b03534805f951 Mar 20 11:01:22 crc kubenswrapper[4860]: I0320 11:01:22.768096 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566740-26bw9" event={"ID":"b31d1240-ea69-4da9-9a40-70f252222d4d","Type":"ContainerStarted","Data":"5a39a5b27057452598a534207e217f2291067df2b9f67d84a1fc26e76a94fa5a"} Mar 20 11:01:22 crc kubenswrapper[4860]: I0320 11:01:22.771900 4860 generic.go:334] "Generic (PLEG): container finished" podID="41a09ead-8137-4791-896c-c5a9cad7f4cf" containerID="728de8ccc22f402da25ca09407c17b66749c7ba40a4b7eb4c5cb707fe2325a9c" exitCode=0 Mar 20 11:01:22 crc kubenswrapper[4860]: I0320 11:01:22.771959 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-627pq" event={"ID":"41a09ead-8137-4791-896c-c5a9cad7f4cf","Type":"ContainerDied","Data":"728de8ccc22f402da25ca09407c17b66749c7ba40a4b7eb4c5cb707fe2325a9c"} Mar 20 11:01:22 crc kubenswrapper[4860]: I0320 11:01:22.772000 4860 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-627pq" event={"ID":"41a09ead-8137-4791-896c-c5a9cad7f4cf","Type":"ContainerStarted","Data":"95cd01e08e4aa9d5bc8aab08fcfec2f0373c59ae07e50c96cd2b03534805f951"} Mar 20 11:01:23 crc kubenswrapper[4860]: I0320 11:01:23.778729 4860 generic.go:334] "Generic (PLEG): container finished" podID="b31d1240-ea69-4da9-9a40-70f252222d4d" containerID="e24c70c83330a3f76a9f16d28ca14d8e62ae9184fa806a34c1edf7e65a681362" exitCode=0 Mar 20 11:01:23 crc kubenswrapper[4860]: I0320 11:01:23.778785 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566740-26bw9" event={"ID":"b31d1240-ea69-4da9-9a40-70f252222d4d","Type":"ContainerDied","Data":"e24c70c83330a3f76a9f16d28ca14d8e62ae9184fa806a34c1edf7e65a681362"} Mar 20 11:01:24 crc kubenswrapper[4860]: I0320 11:01:24.039510 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-627pq" Mar 20 11:01:24 crc kubenswrapper[4860]: I0320 11:01:24.153350 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/41a09ead-8137-4791-896c-c5a9cad7f4cf-config-volume\") pod \"41a09ead-8137-4791-896c-c5a9cad7f4cf\" (UID: \"41a09ead-8137-4791-896c-c5a9cad7f4cf\") " Mar 20 11:01:24 crc kubenswrapper[4860]: I0320 11:01:24.153392 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/41a09ead-8137-4791-896c-c5a9cad7f4cf-secret-volume\") pod \"41a09ead-8137-4791-896c-c5a9cad7f4cf\" (UID: \"41a09ead-8137-4791-896c-c5a9cad7f4cf\") " Mar 20 11:01:24 crc kubenswrapper[4860]: I0320 11:01:24.153512 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sv5sr\" (UniqueName: 
\"kubernetes.io/projected/41a09ead-8137-4791-896c-c5a9cad7f4cf-kube-api-access-sv5sr\") pod \"41a09ead-8137-4791-896c-c5a9cad7f4cf\" (UID: \"41a09ead-8137-4791-896c-c5a9cad7f4cf\") " Mar 20 11:01:24 crc kubenswrapper[4860]: I0320 11:01:24.154820 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41a09ead-8137-4791-896c-c5a9cad7f4cf-config-volume" (OuterVolumeSpecName: "config-volume") pod "41a09ead-8137-4791-896c-c5a9cad7f4cf" (UID: "41a09ead-8137-4791-896c-c5a9cad7f4cf"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:01:24 crc kubenswrapper[4860]: I0320 11:01:24.163690 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41a09ead-8137-4791-896c-c5a9cad7f4cf-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "41a09ead-8137-4791-896c-c5a9cad7f4cf" (UID: "41a09ead-8137-4791-896c-c5a9cad7f4cf"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:01:24 crc kubenswrapper[4860]: I0320 11:01:24.167066 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41a09ead-8137-4791-896c-c5a9cad7f4cf-kube-api-access-sv5sr" (OuterVolumeSpecName: "kube-api-access-sv5sr") pod "41a09ead-8137-4791-896c-c5a9cad7f4cf" (UID: "41a09ead-8137-4791-896c-c5a9cad7f4cf"). InnerVolumeSpecName "kube-api-access-sv5sr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:01:24 crc kubenswrapper[4860]: I0320 11:01:24.254603 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sv5sr\" (UniqueName: \"kubernetes.io/projected/41a09ead-8137-4791-896c-c5a9cad7f4cf-kube-api-access-sv5sr\") on node \"crc\" DevicePath \"\"" Mar 20 11:01:24 crc kubenswrapper[4860]: I0320 11:01:24.254642 4860 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/41a09ead-8137-4791-896c-c5a9cad7f4cf-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 11:01:24 crc kubenswrapper[4860]: I0320 11:01:24.254652 4860 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/41a09ead-8137-4791-896c-c5a9cad7f4cf-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 11:01:24 crc kubenswrapper[4860]: I0320 11:01:24.789069 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-627pq" Mar 20 11:01:24 crc kubenswrapper[4860]: I0320 11:01:24.789273 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-627pq" event={"ID":"41a09ead-8137-4791-896c-c5a9cad7f4cf","Type":"ContainerDied","Data":"95cd01e08e4aa9d5bc8aab08fcfec2f0373c59ae07e50c96cd2b03534805f951"} Mar 20 11:01:24 crc kubenswrapper[4860]: I0320 11:01:24.793869 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="95cd01e08e4aa9d5bc8aab08fcfec2f0373c59ae07e50c96cd2b03534805f951" Mar 20 11:01:25 crc kubenswrapper[4860]: I0320 11:01:25.032896 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566740-26bw9" Mar 20 11:01:25 crc kubenswrapper[4860]: I0320 11:01:25.177127 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlzmr\" (UniqueName: \"kubernetes.io/projected/b31d1240-ea69-4da9-9a40-70f252222d4d-kube-api-access-mlzmr\") pod \"b31d1240-ea69-4da9-9a40-70f252222d4d\" (UID: \"b31d1240-ea69-4da9-9a40-70f252222d4d\") " Mar 20 11:01:25 crc kubenswrapper[4860]: I0320 11:01:25.183427 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b31d1240-ea69-4da9-9a40-70f252222d4d-kube-api-access-mlzmr" (OuterVolumeSpecName: "kube-api-access-mlzmr") pod "b31d1240-ea69-4da9-9a40-70f252222d4d" (UID: "b31d1240-ea69-4da9-9a40-70f252222d4d"). InnerVolumeSpecName "kube-api-access-mlzmr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:01:25 crc kubenswrapper[4860]: I0320 11:01:25.279579 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mlzmr\" (UniqueName: \"kubernetes.io/projected/b31d1240-ea69-4da9-9a40-70f252222d4d-kube-api-access-mlzmr\") on node \"crc\" DevicePath \"\"" Mar 20 11:01:25 crc kubenswrapper[4860]: I0320 11:01:25.796470 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566740-26bw9" Mar 20 11:01:25 crc kubenswrapper[4860]: I0320 11:01:25.796504 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566740-26bw9" event={"ID":"b31d1240-ea69-4da9-9a40-70f252222d4d","Type":"ContainerDied","Data":"5a39a5b27057452598a534207e217f2291067df2b9f67d84a1fc26e76a94fa5a"} Mar 20 11:01:25 crc kubenswrapper[4860]: I0320 11:01:25.796554 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a39a5b27057452598a534207e217f2291067df2b9f67d84a1fc26e76a94fa5a" Mar 20 11:01:35 crc kubenswrapper[4860]: I0320 11:01:35.364431 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-r7ckk"] Mar 20 11:01:35 crc kubenswrapper[4860]: I0320 11:01:35.365339 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-r7ckk" podUID="f0e14a08-824b-450f-bf98-2a476da0d44b" containerName="registry-server" containerID="cri-o://1aa3a53909a5c843b03ac48e95c43de617bf5c9f3bb63dc85380fa7fe7677bb4" gracePeriod=2 Mar 20 11:01:35 crc kubenswrapper[4860]: I0320 11:01:35.570180 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jx27x"] Mar 20 11:01:35 crc kubenswrapper[4860]: I0320 11:01:35.570573 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jx27x" podUID="f81a43aa-2c39-4d49-8526-f097322dd7bf" containerName="registry-server" containerID="cri-o://19e5b54cfca0bde6cf7f8393230c7ed69a1c9b30a1db1b83c9b87021655f765e" gracePeriod=2 Mar 20 11:01:35 crc kubenswrapper[4860]: I0320 11:01:35.741784 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-r7ckk" Mar 20 11:01:35 crc kubenswrapper[4860]: I0320 11:01:35.838484 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0e14a08-824b-450f-bf98-2a476da0d44b-utilities\") pod \"f0e14a08-824b-450f-bf98-2a476da0d44b\" (UID: \"f0e14a08-824b-450f-bf98-2a476da0d44b\") " Mar 20 11:01:35 crc kubenswrapper[4860]: I0320 11:01:35.838959 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0e14a08-824b-450f-bf98-2a476da0d44b-catalog-content\") pod \"f0e14a08-824b-450f-bf98-2a476da0d44b\" (UID: \"f0e14a08-824b-450f-bf98-2a476da0d44b\") " Mar 20 11:01:35 crc kubenswrapper[4860]: I0320 11:01:35.839068 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhz7s\" (UniqueName: \"kubernetes.io/projected/f0e14a08-824b-450f-bf98-2a476da0d44b-kube-api-access-jhz7s\") pod \"f0e14a08-824b-450f-bf98-2a476da0d44b\" (UID: \"f0e14a08-824b-450f-bf98-2a476da0d44b\") " Mar 20 11:01:35 crc kubenswrapper[4860]: I0320 11:01:35.839569 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0e14a08-824b-450f-bf98-2a476da0d44b-utilities" (OuterVolumeSpecName: "utilities") pod "f0e14a08-824b-450f-bf98-2a476da0d44b" (UID: "f0e14a08-824b-450f-bf98-2a476da0d44b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:01:35 crc kubenswrapper[4860]: I0320 11:01:35.844069 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0e14a08-824b-450f-bf98-2a476da0d44b-kube-api-access-jhz7s" (OuterVolumeSpecName: "kube-api-access-jhz7s") pod "f0e14a08-824b-450f-bf98-2a476da0d44b" (UID: "f0e14a08-824b-450f-bf98-2a476da0d44b"). InnerVolumeSpecName "kube-api-access-jhz7s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:01:35 crc kubenswrapper[4860]: I0320 11:01:35.865121 4860 generic.go:334] "Generic (PLEG): container finished" podID="f81a43aa-2c39-4d49-8526-f097322dd7bf" containerID="19e5b54cfca0bde6cf7f8393230c7ed69a1c9b30a1db1b83c9b87021655f765e" exitCode=0 Mar 20 11:01:35 crc kubenswrapper[4860]: I0320 11:01:35.865150 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jx27x" event={"ID":"f81a43aa-2c39-4d49-8526-f097322dd7bf","Type":"ContainerDied","Data":"19e5b54cfca0bde6cf7f8393230c7ed69a1c9b30a1db1b83c9b87021655f765e"} Mar 20 11:01:35 crc kubenswrapper[4860]: I0320 11:01:35.868113 4860 generic.go:334] "Generic (PLEG): container finished" podID="f0e14a08-824b-450f-bf98-2a476da0d44b" containerID="1aa3a53909a5c843b03ac48e95c43de617bf5c9f3bb63dc85380fa7fe7677bb4" exitCode=0 Mar 20 11:01:35 crc kubenswrapper[4860]: I0320 11:01:35.868172 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-r7ckk" Mar 20 11:01:35 crc kubenswrapper[4860]: I0320 11:01:35.868171 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r7ckk" event={"ID":"f0e14a08-824b-450f-bf98-2a476da0d44b","Type":"ContainerDied","Data":"1aa3a53909a5c843b03ac48e95c43de617bf5c9f3bb63dc85380fa7fe7677bb4"} Mar 20 11:01:35 crc kubenswrapper[4860]: I0320 11:01:35.868325 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r7ckk" event={"ID":"f0e14a08-824b-450f-bf98-2a476da0d44b","Type":"ContainerDied","Data":"711ef831caa70569060ba2dc068e9cede6a21ca93c6a666bf7abd4f4e2156736"} Mar 20 11:01:35 crc kubenswrapper[4860]: I0320 11:01:35.868357 4860 scope.go:117] "RemoveContainer" containerID="1aa3a53909a5c843b03ac48e95c43de617bf5c9f3bb63dc85380fa7fe7677bb4" Mar 20 11:01:35 crc kubenswrapper[4860]: I0320 11:01:35.896616 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0e14a08-824b-450f-bf98-2a476da0d44b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f0e14a08-824b-450f-bf98-2a476da0d44b" (UID: "f0e14a08-824b-450f-bf98-2a476da0d44b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:01:35 crc kubenswrapper[4860]: I0320 11:01:35.897089 4860 scope.go:117] "RemoveContainer" containerID="a5700f11fea1e3d3f4586b36f718ac04fc6f4838eaf0dec6f0868235a01313a6" Mar 20 11:01:35 crc kubenswrapper[4860]: I0320 11:01:35.925511 4860 scope.go:117] "RemoveContainer" containerID="19dec2e9725dfca97383712d1fced11c792963a7799dee6ec4f9f37020ff60b2" Mar 20 11:01:35 crc kubenswrapper[4860]: I0320 11:01:35.938657 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jx27x" Mar 20 11:01:35 crc kubenswrapper[4860]: I0320 11:01:35.940315 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhz7s\" (UniqueName: \"kubernetes.io/projected/f0e14a08-824b-450f-bf98-2a476da0d44b-kube-api-access-jhz7s\") on node \"crc\" DevicePath \"\"" Mar 20 11:01:35 crc kubenswrapper[4860]: I0320 11:01:35.940339 4860 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0e14a08-824b-450f-bf98-2a476da0d44b-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:01:35 crc kubenswrapper[4860]: I0320 11:01:35.940349 4860 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0e14a08-824b-450f-bf98-2a476da0d44b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:01:35 crc kubenswrapper[4860]: I0320 11:01:35.946564 4860 scope.go:117] "RemoveContainer" containerID="1aa3a53909a5c843b03ac48e95c43de617bf5c9f3bb63dc85380fa7fe7677bb4" Mar 20 11:01:35 crc kubenswrapper[4860]: E0320 11:01:35.947110 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1aa3a53909a5c843b03ac48e95c43de617bf5c9f3bb63dc85380fa7fe7677bb4\": container with ID starting with 1aa3a53909a5c843b03ac48e95c43de617bf5c9f3bb63dc85380fa7fe7677bb4 not found: ID does not exist" containerID="1aa3a53909a5c843b03ac48e95c43de617bf5c9f3bb63dc85380fa7fe7677bb4" Mar 20 11:01:35 crc kubenswrapper[4860]: I0320 11:01:35.947186 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1aa3a53909a5c843b03ac48e95c43de617bf5c9f3bb63dc85380fa7fe7677bb4"} err="failed to get container status \"1aa3a53909a5c843b03ac48e95c43de617bf5c9f3bb63dc85380fa7fe7677bb4\": rpc error: code = NotFound desc = could not find container \"1aa3a53909a5c843b03ac48e95c43de617bf5c9f3bb63dc85380fa7fe7677bb4\": 
container with ID starting with 1aa3a53909a5c843b03ac48e95c43de617bf5c9f3bb63dc85380fa7fe7677bb4 not found: ID does not exist" Mar 20 11:01:35 crc kubenswrapper[4860]: I0320 11:01:35.947277 4860 scope.go:117] "RemoveContainer" containerID="a5700f11fea1e3d3f4586b36f718ac04fc6f4838eaf0dec6f0868235a01313a6" Mar 20 11:01:35 crc kubenswrapper[4860]: E0320 11:01:35.947928 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5700f11fea1e3d3f4586b36f718ac04fc6f4838eaf0dec6f0868235a01313a6\": container with ID starting with a5700f11fea1e3d3f4586b36f718ac04fc6f4838eaf0dec6f0868235a01313a6 not found: ID does not exist" containerID="a5700f11fea1e3d3f4586b36f718ac04fc6f4838eaf0dec6f0868235a01313a6" Mar 20 11:01:35 crc kubenswrapper[4860]: I0320 11:01:35.947976 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5700f11fea1e3d3f4586b36f718ac04fc6f4838eaf0dec6f0868235a01313a6"} err="failed to get container status \"a5700f11fea1e3d3f4586b36f718ac04fc6f4838eaf0dec6f0868235a01313a6\": rpc error: code = NotFound desc = could not find container \"a5700f11fea1e3d3f4586b36f718ac04fc6f4838eaf0dec6f0868235a01313a6\": container with ID starting with a5700f11fea1e3d3f4586b36f718ac04fc6f4838eaf0dec6f0868235a01313a6 not found: ID does not exist" Mar 20 11:01:35 crc kubenswrapper[4860]: I0320 11:01:35.948006 4860 scope.go:117] "RemoveContainer" containerID="19dec2e9725dfca97383712d1fced11c792963a7799dee6ec4f9f37020ff60b2" Mar 20 11:01:35 crc kubenswrapper[4860]: E0320 11:01:35.948357 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19dec2e9725dfca97383712d1fced11c792963a7799dee6ec4f9f37020ff60b2\": container with ID starting with 19dec2e9725dfca97383712d1fced11c792963a7799dee6ec4f9f37020ff60b2 not found: ID does not exist" 
containerID="19dec2e9725dfca97383712d1fced11c792963a7799dee6ec4f9f37020ff60b2" Mar 20 11:01:35 crc kubenswrapper[4860]: I0320 11:01:35.948378 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19dec2e9725dfca97383712d1fced11c792963a7799dee6ec4f9f37020ff60b2"} err="failed to get container status \"19dec2e9725dfca97383712d1fced11c792963a7799dee6ec4f9f37020ff60b2\": rpc error: code = NotFound desc = could not find container \"19dec2e9725dfca97383712d1fced11c792963a7799dee6ec4f9f37020ff60b2\": container with ID starting with 19dec2e9725dfca97383712d1fced11c792963a7799dee6ec4f9f37020ff60b2 not found: ID does not exist" Mar 20 11:01:36 crc kubenswrapper[4860]: I0320 11:01:36.041655 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f81a43aa-2c39-4d49-8526-f097322dd7bf-catalog-content\") pod \"f81a43aa-2c39-4d49-8526-f097322dd7bf\" (UID: \"f81a43aa-2c39-4d49-8526-f097322dd7bf\") " Mar 20 11:01:36 crc kubenswrapper[4860]: I0320 11:01:36.041759 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f81a43aa-2c39-4d49-8526-f097322dd7bf-utilities\") pod \"f81a43aa-2c39-4d49-8526-f097322dd7bf\" (UID: \"f81a43aa-2c39-4d49-8526-f097322dd7bf\") " Mar 20 11:01:36 crc kubenswrapper[4860]: I0320 11:01:36.041794 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mn9d7\" (UniqueName: \"kubernetes.io/projected/f81a43aa-2c39-4d49-8526-f097322dd7bf-kube-api-access-mn9d7\") pod \"f81a43aa-2c39-4d49-8526-f097322dd7bf\" (UID: \"f81a43aa-2c39-4d49-8526-f097322dd7bf\") " Mar 20 11:01:36 crc kubenswrapper[4860]: I0320 11:01:36.043115 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f81a43aa-2c39-4d49-8526-f097322dd7bf-utilities" (OuterVolumeSpecName: "utilities") pod 
"f81a43aa-2c39-4d49-8526-f097322dd7bf" (UID: "f81a43aa-2c39-4d49-8526-f097322dd7bf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:01:36 crc kubenswrapper[4860]: I0320 11:01:36.045495 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f81a43aa-2c39-4d49-8526-f097322dd7bf-kube-api-access-mn9d7" (OuterVolumeSpecName: "kube-api-access-mn9d7") pod "f81a43aa-2c39-4d49-8526-f097322dd7bf" (UID: "f81a43aa-2c39-4d49-8526-f097322dd7bf"). InnerVolumeSpecName "kube-api-access-mn9d7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:01:36 crc kubenswrapper[4860]: I0320 11:01:36.146221 4860 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f81a43aa-2c39-4d49-8526-f097322dd7bf-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:01:36 crc kubenswrapper[4860]: I0320 11:01:36.146404 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mn9d7\" (UniqueName: \"kubernetes.io/projected/f81a43aa-2c39-4d49-8526-f097322dd7bf-kube-api-access-mn9d7\") on node \"crc\" DevicePath \"\"" Mar 20 11:01:36 crc kubenswrapper[4860]: I0320 11:01:36.158314 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f81a43aa-2c39-4d49-8526-f097322dd7bf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f81a43aa-2c39-4d49-8526-f097322dd7bf" (UID: "f81a43aa-2c39-4d49-8526-f097322dd7bf"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:01:36 crc kubenswrapper[4860]: I0320 11:01:36.200314 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-r7ckk"] Mar 20 11:01:36 crc kubenswrapper[4860]: I0320 11:01:36.204275 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-r7ckk"] Mar 20 11:01:36 crc kubenswrapper[4860]: I0320 11:01:36.247860 4860 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f81a43aa-2c39-4d49-8526-f097322dd7bf-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:01:36 crc kubenswrapper[4860]: I0320 11:01:36.877247 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jx27x" event={"ID":"f81a43aa-2c39-4d49-8526-f097322dd7bf","Type":"ContainerDied","Data":"b681f56bdfe5d7107d46e455eb32b402c90f63f71364660357f8ecd488fda604"} Mar 20 11:01:36 crc kubenswrapper[4860]: I0320 11:01:36.877305 4860 scope.go:117] "RemoveContainer" containerID="19e5b54cfca0bde6cf7f8393230c7ed69a1c9b30a1db1b83c9b87021655f765e" Mar 20 11:01:36 crc kubenswrapper[4860]: I0320 11:01:36.877343 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jx27x" Mar 20 11:01:36 crc kubenswrapper[4860]: I0320 11:01:36.907534 4860 scope.go:117] "RemoveContainer" containerID="5000df7098db21086cd500235d0dfe1cd2c6c277e01022d3498595c652a31046" Mar 20 11:01:36 crc kubenswrapper[4860]: I0320 11:01:36.925601 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jx27x"] Mar 20 11:01:36 crc kubenswrapper[4860]: I0320 11:01:36.932358 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jx27x"] Mar 20 11:01:36 crc kubenswrapper[4860]: I0320 11:01:36.940317 4860 scope.go:117] "RemoveContainer" containerID="cc28a9c4b1f826fc06b2b83281cd0a01bf1dc28b3e9617ab722a34ea90577dc6" Mar 20 11:01:37 crc kubenswrapper[4860]: I0320 11:01:37.424969 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0e14a08-824b-450f-bf98-2a476da0d44b" path="/var/lib/kubelet/pods/f0e14a08-824b-450f-bf98-2a476da0d44b/volumes" Mar 20 11:01:37 crc kubenswrapper[4860]: I0320 11:01:37.426669 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f81a43aa-2c39-4d49-8526-f097322dd7bf" path="/var/lib/kubelet/pods/f81a43aa-2c39-4d49-8526-f097322dd7bf/volumes" Mar 20 11:01:52 crc kubenswrapper[4860]: I0320 11:01:52.344269 4860 patch_prober.go:28] interesting pod/machine-config-daemon-kvdqp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:01:52 crc kubenswrapper[4860]: I0320 11:01:52.344964 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" Mar 20 11:02:00 crc kubenswrapper[4860]: I0320 11:02:00.144736 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566742-tczkf"] Mar 20 11:02:00 crc kubenswrapper[4860]: E0320 11:02:00.146417 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0e14a08-824b-450f-bf98-2a476da0d44b" containerName="registry-server" Mar 20 11:02:00 crc kubenswrapper[4860]: I0320 11:02:00.146497 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0e14a08-824b-450f-bf98-2a476da0d44b" containerName="registry-server" Mar 20 11:02:00 crc kubenswrapper[4860]: E0320 11:02:00.146566 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41a09ead-8137-4791-896c-c5a9cad7f4cf" containerName="collect-profiles" Mar 20 11:02:00 crc kubenswrapper[4860]: I0320 11:02:00.146623 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="41a09ead-8137-4791-896c-c5a9cad7f4cf" containerName="collect-profiles" Mar 20 11:02:00 crc kubenswrapper[4860]: E0320 11:02:00.146688 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b31d1240-ea69-4da9-9a40-70f252222d4d" containerName="oc" Mar 20 11:02:00 crc kubenswrapper[4860]: I0320 11:02:00.146750 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="b31d1240-ea69-4da9-9a40-70f252222d4d" containerName="oc" Mar 20 11:02:00 crc kubenswrapper[4860]: E0320 11:02:00.146811 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0e14a08-824b-450f-bf98-2a476da0d44b" containerName="extract-content" Mar 20 11:02:00 crc kubenswrapper[4860]: I0320 11:02:00.146868 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0e14a08-824b-450f-bf98-2a476da0d44b" containerName="extract-content" Mar 20 11:02:00 crc kubenswrapper[4860]: E0320 11:02:00.146925 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0e14a08-824b-450f-bf98-2a476da0d44b" containerName="extract-utilities" Mar 20 11:02:00 crc 
kubenswrapper[4860]: I0320 11:02:00.146979 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0e14a08-824b-450f-bf98-2a476da0d44b" containerName="extract-utilities" Mar 20 11:02:00 crc kubenswrapper[4860]: E0320 11:02:00.147040 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f81a43aa-2c39-4d49-8526-f097322dd7bf" containerName="extract-utilities" Mar 20 11:02:00 crc kubenswrapper[4860]: I0320 11:02:00.147399 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="f81a43aa-2c39-4d49-8526-f097322dd7bf" containerName="extract-utilities" Mar 20 11:02:00 crc kubenswrapper[4860]: E0320 11:02:00.147480 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f81a43aa-2c39-4d49-8526-f097322dd7bf" containerName="extract-content" Mar 20 11:02:00 crc kubenswrapper[4860]: I0320 11:02:00.147551 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="f81a43aa-2c39-4d49-8526-f097322dd7bf" containerName="extract-content" Mar 20 11:02:00 crc kubenswrapper[4860]: E0320 11:02:00.147620 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f81a43aa-2c39-4d49-8526-f097322dd7bf" containerName="registry-server" Mar 20 11:02:00 crc kubenswrapper[4860]: I0320 11:02:00.147698 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="f81a43aa-2c39-4d49-8526-f097322dd7bf" containerName="registry-server" Mar 20 11:02:00 crc kubenswrapper[4860]: I0320 11:02:00.147878 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0e14a08-824b-450f-bf98-2a476da0d44b" containerName="registry-server" Mar 20 11:02:00 crc kubenswrapper[4860]: I0320 11:02:00.147949 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="41a09ead-8137-4791-896c-c5a9cad7f4cf" containerName="collect-profiles" Mar 20 11:02:00 crc kubenswrapper[4860]: I0320 11:02:00.148012 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="b31d1240-ea69-4da9-9a40-70f252222d4d" containerName="oc" Mar 20 11:02:00 crc 
kubenswrapper[4860]: I0320 11:02:00.148075 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="f81a43aa-2c39-4d49-8526-f097322dd7bf" containerName="registry-server" Mar 20 11:02:00 crc kubenswrapper[4860]: I0320 11:02:00.148593 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566742-tczkf" Mar 20 11:02:00 crc kubenswrapper[4860]: I0320 11:02:00.151972 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:02:00 crc kubenswrapper[4860]: I0320 11:02:00.152843 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-82p2f" Mar 20 11:02:00 crc kubenswrapper[4860]: I0320 11:02:00.158631 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566742-tczkf"] Mar 20 11:02:00 crc kubenswrapper[4860]: I0320 11:02:00.158807 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:02:00 crc kubenswrapper[4860]: I0320 11:02:00.160402 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h48qf\" (UniqueName: \"kubernetes.io/projected/980f5756-4935-469d-933b-f4e339ded9a4-kube-api-access-h48qf\") pod \"auto-csr-approver-29566742-tczkf\" (UID: \"980f5756-4935-469d-933b-f4e339ded9a4\") " pod="openshift-infra/auto-csr-approver-29566742-tczkf" Mar 20 11:02:00 crc kubenswrapper[4860]: I0320 11:02:00.262579 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h48qf\" (UniqueName: \"kubernetes.io/projected/980f5756-4935-469d-933b-f4e339ded9a4-kube-api-access-h48qf\") pod \"auto-csr-approver-29566742-tczkf\" (UID: \"980f5756-4935-469d-933b-f4e339ded9a4\") " pod="openshift-infra/auto-csr-approver-29566742-tczkf" Mar 20 11:02:00 crc kubenswrapper[4860]: I0320 11:02:00.289939 
4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h48qf\" (UniqueName: \"kubernetes.io/projected/980f5756-4935-469d-933b-f4e339ded9a4-kube-api-access-h48qf\") pod \"auto-csr-approver-29566742-tczkf\" (UID: \"980f5756-4935-469d-933b-f4e339ded9a4\") " pod="openshift-infra/auto-csr-approver-29566742-tczkf" Mar 20 11:02:00 crc kubenswrapper[4860]: I0320 11:02:00.469843 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566742-tczkf" Mar 20 11:02:01 crc kubenswrapper[4860]: I0320 11:02:01.031833 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566742-tczkf"] Mar 20 11:02:02 crc kubenswrapper[4860]: I0320 11:02:02.032372 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566742-tczkf" event={"ID":"980f5756-4935-469d-933b-f4e339ded9a4","Type":"ContainerStarted","Data":"7daa785220c2781aa57144adb5ff002190bf0947817569818a884fa78dbf2974"} Mar 20 11:02:03 crc kubenswrapper[4860]: I0320 11:02:03.043191 4860 generic.go:334] "Generic (PLEG): container finished" podID="980f5756-4935-469d-933b-f4e339ded9a4" containerID="5e0b7b6725e58dc6c9517f6806ff8c8ba7c117d2ad076272e2c94e40ea777f46" exitCode=0 Mar 20 11:02:03 crc kubenswrapper[4860]: I0320 11:02:03.043932 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566742-tczkf" event={"ID":"980f5756-4935-469d-933b-f4e339ded9a4","Type":"ContainerDied","Data":"5e0b7b6725e58dc6c9517f6806ff8c8ba7c117d2ad076272e2c94e40ea777f46"} Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.157667 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-w5w95"] Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.159062 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-w5w95" 
podUID="4f84f111-5991-4e78-9508-82283b8e36f7" containerName="registry-server" containerID="cri-o://bd0572960d7229285fe10a8cb3d58bbc9ca87bc5ab990b09db80034d14aa31ff" gracePeriod=30 Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.182585 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-n79b7"] Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.182916 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-n79b7" podUID="d2690d8b-c7f7-4e71-af44-33444e4d6187" containerName="registry-server" containerID="cri-o://8e14e348104a685ae72ae8d4fc93750bfae41d97762c8a26ba93a04b1dc8a3d4" gracePeriod=30 Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.195187 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zhgh4"] Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.195435 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-zhgh4" podUID="403ca5f6-bd52-40de-88d6-5151b3202c76" containerName="marketplace-operator" containerID="cri-o://71b2cb77686050f112cbf04ff04d1685bf9b88c82f8a98a4da293848bc821143" gracePeriod=30 Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.216950 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d9xlp"] Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.217249 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-d9xlp" podUID="f20cb95e-5480-4c9c-859f-0b03d679ab06" containerName="registry-server" containerID="cri-o://3d9773a58e39a2f11588636465b237f3ac7e55d6ecd57abbb192e60cfa1df11f" gracePeriod=30 Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.227405 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-qq8bh"] Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.227644 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qq8bh" podUID="514f05c3-1404-46c6-9f4d-68437ea8ee0b" containerName="registry-server" containerID="cri-o://dd62fe82165d292d4799e06ebb35b908db080de2fe5e228a5d81891d222fa958" gracePeriod=30 Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.232200 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qkfjv"] Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.233588 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qkfjv" Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.270790 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qkfjv"] Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.334093 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/489f9463-a47c-4635-aad3-866e47a2c97f-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qkfjv\" (UID: \"489f9463-a47c-4635-aad3-866e47a2c97f\") " pod="openshift-marketplace/marketplace-operator-79b997595-qkfjv" Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.334191 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/489f9463-a47c-4635-aad3-866e47a2c97f-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qkfjv\" (UID: \"489f9463-a47c-4635-aad3-866e47a2c97f\") " pod="openshift-marketplace/marketplace-operator-79b997595-qkfjv" Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.334283 4860 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnpfl\" (UniqueName: \"kubernetes.io/projected/489f9463-a47c-4635-aad3-866e47a2c97f-kube-api-access-pnpfl\") pod \"marketplace-operator-79b997595-qkfjv\" (UID: \"489f9463-a47c-4635-aad3-866e47a2c97f\") " pod="openshift-marketplace/marketplace-operator-79b997595-qkfjv" Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.436931 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/489f9463-a47c-4635-aad3-866e47a2c97f-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qkfjv\" (UID: \"489f9463-a47c-4635-aad3-866e47a2c97f\") " pod="openshift-marketplace/marketplace-operator-79b997595-qkfjv" Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.437556 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnpfl\" (UniqueName: \"kubernetes.io/projected/489f9463-a47c-4635-aad3-866e47a2c97f-kube-api-access-pnpfl\") pod \"marketplace-operator-79b997595-qkfjv\" (UID: \"489f9463-a47c-4635-aad3-866e47a2c97f\") " pod="openshift-marketplace/marketplace-operator-79b997595-qkfjv" Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.437611 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/489f9463-a47c-4635-aad3-866e47a2c97f-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qkfjv\" (UID: \"489f9463-a47c-4635-aad3-866e47a2c97f\") " pod="openshift-marketplace/marketplace-operator-79b997595-qkfjv" Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.439510 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/489f9463-a47c-4635-aad3-866e47a2c97f-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qkfjv\" (UID: 
\"489f9463-a47c-4635-aad3-866e47a2c97f\") " pod="openshift-marketplace/marketplace-operator-79b997595-qkfjv" Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.448120 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/489f9463-a47c-4635-aad3-866e47a2c97f-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qkfjv\" (UID: \"489f9463-a47c-4635-aad3-866e47a2c97f\") " pod="openshift-marketplace/marketplace-operator-79b997595-qkfjv" Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.465048 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnpfl\" (UniqueName: \"kubernetes.io/projected/489f9463-a47c-4635-aad3-866e47a2c97f-kube-api-access-pnpfl\") pod \"marketplace-operator-79b997595-qkfjv\" (UID: \"489f9463-a47c-4635-aad3-866e47a2c97f\") " pod="openshift-marketplace/marketplace-operator-79b997595-qkfjv" Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.608890 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qkfjv" Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.619710 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566742-tczkf" Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.642369 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h48qf\" (UniqueName: \"kubernetes.io/projected/980f5756-4935-469d-933b-f4e339ded9a4-kube-api-access-h48qf\") pod \"980f5756-4935-469d-933b-f4e339ded9a4\" (UID: \"980f5756-4935-469d-933b-f4e339ded9a4\") " Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.647466 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/980f5756-4935-469d-933b-f4e339ded9a4-kube-api-access-h48qf" (OuterVolumeSpecName: "kube-api-access-h48qf") pod "980f5756-4935-469d-933b-f4e339ded9a4" (UID: "980f5756-4935-469d-933b-f4e339ded9a4"). InnerVolumeSpecName "kube-api-access-h48qf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.691122 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n79b7" Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.742786 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-w5w95" Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.743704 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2690d8b-c7f7-4e71-af44-33444e4d6187-utilities\") pod \"d2690d8b-c7f7-4e71-af44-33444e4d6187\" (UID: \"d2690d8b-c7f7-4e71-af44-33444e4d6187\") " Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.745202 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6fks5\" (UniqueName: \"kubernetes.io/projected/d2690d8b-c7f7-4e71-af44-33444e4d6187-kube-api-access-6fks5\") pod \"d2690d8b-c7f7-4e71-af44-33444e4d6187\" (UID: \"d2690d8b-c7f7-4e71-af44-33444e4d6187\") " Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.745536 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2690d8b-c7f7-4e71-af44-33444e4d6187-catalog-content\") pod \"d2690d8b-c7f7-4e71-af44-33444e4d6187\" (UID: \"d2690d8b-c7f7-4e71-af44-33444e4d6187\") " Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.745815 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h48qf\" (UniqueName: \"kubernetes.io/projected/980f5756-4935-469d-933b-f4e339ded9a4-kube-api-access-h48qf\") on node \"crc\" DevicePath \"\"" Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.746544 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2690d8b-c7f7-4e71-af44-33444e4d6187-utilities" (OuterVolumeSpecName: "utilities") pod "d2690d8b-c7f7-4e71-af44-33444e4d6187" (UID: "d2690d8b-c7f7-4e71-af44-33444e4d6187"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.753618 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2690d8b-c7f7-4e71-af44-33444e4d6187-kube-api-access-6fks5" (OuterVolumeSpecName: "kube-api-access-6fks5") pod "d2690d8b-c7f7-4e71-af44-33444e4d6187" (UID: "d2690d8b-c7f7-4e71-af44-33444e4d6187"). InnerVolumeSpecName "kube-api-access-6fks5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.855126 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f84f111-5991-4e78-9508-82283b8e36f7-catalog-content\") pod \"4f84f111-5991-4e78-9508-82283b8e36f7\" (UID: \"4f84f111-5991-4e78-9508-82283b8e36f7\") " Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.855595 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4h2z2\" (UniqueName: \"kubernetes.io/projected/4f84f111-5991-4e78-9508-82283b8e36f7-kube-api-access-4h2z2\") pod \"4f84f111-5991-4e78-9508-82283b8e36f7\" (UID: \"4f84f111-5991-4e78-9508-82283b8e36f7\") " Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.855688 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f84f111-5991-4e78-9508-82283b8e36f7-utilities\") pod \"4f84f111-5991-4e78-9508-82283b8e36f7\" (UID: \"4f84f111-5991-4e78-9508-82283b8e36f7\") " Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.856120 4860 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2690d8b-c7f7-4e71-af44-33444e4d6187-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.856149 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6fks5\" 
(UniqueName: \"kubernetes.io/projected/d2690d8b-c7f7-4e71-af44-33444e4d6187-kube-api-access-6fks5\") on node \"crc\" DevicePath \"\"" Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.857062 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f84f111-5991-4e78-9508-82283b8e36f7-utilities" (OuterVolumeSpecName: "utilities") pod "4f84f111-5991-4e78-9508-82283b8e36f7" (UID: "4f84f111-5991-4e78-9508-82283b8e36f7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.866415 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f84f111-5991-4e78-9508-82283b8e36f7-kube-api-access-4h2z2" (OuterVolumeSpecName: "kube-api-access-4h2z2") pod "4f84f111-5991-4e78-9508-82283b8e36f7" (UID: "4f84f111-5991-4e78-9508-82283b8e36f7"). InnerVolumeSpecName "kube-api-access-4h2z2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.909395 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qq8bh" Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.937358 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zhgh4" Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.958570 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d9xlp" Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.977715 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/514f05c3-1404-46c6-9f4d-68437ea8ee0b-catalog-content\") pod \"514f05c3-1404-46c6-9f4d-68437ea8ee0b\" (UID: \"514f05c3-1404-46c6-9f4d-68437ea8ee0b\") " Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.977788 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/514f05c3-1404-46c6-9f4d-68437ea8ee0b-utilities\") pod \"514f05c3-1404-46c6-9f4d-68437ea8ee0b\" (UID: \"514f05c3-1404-46c6-9f4d-68437ea8ee0b\") " Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.977922 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdzkg\" (UniqueName: \"kubernetes.io/projected/514f05c3-1404-46c6-9f4d-68437ea8ee0b-kube-api-access-pdzkg\") pod \"514f05c3-1404-46c6-9f4d-68437ea8ee0b\" (UID: \"514f05c3-1404-46c6-9f4d-68437ea8ee0b\") " Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.978205 4860 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f84f111-5991-4e78-9508-82283b8e36f7-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.978217 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4h2z2\" (UniqueName: \"kubernetes.io/projected/4f84f111-5991-4e78-9508-82283b8e36f7-kube-api-access-4h2z2\") on node \"crc\" DevicePath \"\"" Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.980048 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/514f05c3-1404-46c6-9f4d-68437ea8ee0b-utilities" (OuterVolumeSpecName: "utilities") pod "514f05c3-1404-46c6-9f4d-68437ea8ee0b" (UID: 
"514f05c3-1404-46c6-9f4d-68437ea8ee0b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.993000 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2690d8b-c7f7-4e71-af44-33444e4d6187-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d2690d8b-c7f7-4e71-af44-33444e4d6187" (UID: "d2690d8b-c7f7-4e71-af44-33444e4d6187"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:02:04 crc kubenswrapper[4860]: I0320 11:02:04.997912 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f84f111-5991-4e78-9508-82283b8e36f7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4f84f111-5991-4e78-9508-82283b8e36f7" (UID: "4f84f111-5991-4e78-9508-82283b8e36f7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.009484 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/514f05c3-1404-46c6-9f4d-68437ea8ee0b-kube-api-access-pdzkg" (OuterVolumeSpecName: "kube-api-access-pdzkg") pod "514f05c3-1404-46c6-9f4d-68437ea8ee0b" (UID: "514f05c3-1404-46c6-9f4d-68437ea8ee0b"). InnerVolumeSpecName "kube-api-access-pdzkg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.069773 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566742-tczkf" event={"ID":"980f5756-4935-469d-933b-f4e339ded9a4","Type":"ContainerDied","Data":"7daa785220c2781aa57144adb5ff002190bf0947817569818a884fa78dbf2974"} Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.069821 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7daa785220c2781aa57144adb5ff002190bf0947817569818a884fa78dbf2974" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.069882 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566742-tczkf" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.079722 4860 generic.go:334] "Generic (PLEG): container finished" podID="f20cb95e-5480-4c9c-859f-0b03d679ab06" containerID="3d9773a58e39a2f11588636465b237f3ac7e55d6ecd57abbb192e60cfa1df11f" exitCode=0 Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.079797 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d9xlp" event={"ID":"f20cb95e-5480-4c9c-859f-0b03d679ab06","Type":"ContainerDied","Data":"3d9773a58e39a2f11588636465b237f3ac7e55d6ecd57abbb192e60cfa1df11f"} Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.079830 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d9xlp" event={"ID":"f20cb95e-5480-4c9c-859f-0b03d679ab06","Type":"ContainerDied","Data":"39591daa264ba7bebe5fdc529015addd733110c1c54ed6b98d8a162a754e8d60"} Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.079849 4860 scope.go:117] "RemoveContainer" containerID="3d9773a58e39a2f11588636465b237f3ac7e55d6ecd57abbb192e60cfa1df11f" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.080183 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d9xlp" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.081658 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/403ca5f6-bd52-40de-88d6-5151b3202c76-marketplace-trusted-ca\") pod \"403ca5f6-bd52-40de-88d6-5151b3202c76\" (UID: \"403ca5f6-bd52-40de-88d6-5151b3202c76\") " Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.081723 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kg46p\" (UniqueName: \"kubernetes.io/projected/f20cb95e-5480-4c9c-859f-0b03d679ab06-kube-api-access-kg46p\") pod \"f20cb95e-5480-4c9c-859f-0b03d679ab06\" (UID: \"f20cb95e-5480-4c9c-859f-0b03d679ab06\") " Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.081753 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/403ca5f6-bd52-40de-88d6-5151b3202c76-marketplace-operator-metrics\") pod \"403ca5f6-bd52-40de-88d6-5151b3202c76\" (UID: \"403ca5f6-bd52-40de-88d6-5151b3202c76\") " Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.081810 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hqvlf\" (UniqueName: \"kubernetes.io/projected/403ca5f6-bd52-40de-88d6-5151b3202c76-kube-api-access-hqvlf\") pod \"403ca5f6-bd52-40de-88d6-5151b3202c76\" (UID: \"403ca5f6-bd52-40de-88d6-5151b3202c76\") " Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.081828 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f20cb95e-5480-4c9c-859f-0b03d679ab06-catalog-content\") pod \"f20cb95e-5480-4c9c-859f-0b03d679ab06\" (UID: \"f20cb95e-5480-4c9c-859f-0b03d679ab06\") " Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.081851 4860 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f20cb95e-5480-4c9c-859f-0b03d679ab06-utilities\") pod \"f20cb95e-5480-4c9c-859f-0b03d679ab06\" (UID: \"f20cb95e-5480-4c9c-859f-0b03d679ab06\") " Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.082524 4860 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2690d8b-c7f7-4e71-af44-33444e4d6187-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.082541 4860 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/514f05c3-1404-46c6-9f4d-68437ea8ee0b-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.082551 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdzkg\" (UniqueName: \"kubernetes.io/projected/514f05c3-1404-46c6-9f4d-68437ea8ee0b-kube-api-access-pdzkg\") on node \"crc\" DevicePath \"\"" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.082561 4860 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f84f111-5991-4e78-9508-82283b8e36f7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.083673 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f20cb95e-5480-4c9c-859f-0b03d679ab06-utilities" (OuterVolumeSpecName: "utilities") pod "f20cb95e-5480-4c9c-859f-0b03d679ab06" (UID: "f20cb95e-5480-4c9c-859f-0b03d679ab06"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.086485 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/403ca5f6-bd52-40de-88d6-5151b3202c76-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "403ca5f6-bd52-40de-88d6-5151b3202c76" (UID: "403ca5f6-bd52-40de-88d6-5151b3202c76"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.092005 4860 generic.go:334] "Generic (PLEG): container finished" podID="403ca5f6-bd52-40de-88d6-5151b3202c76" containerID="71b2cb77686050f112cbf04ff04d1685bf9b88c82f8a98a4da293848bc821143" exitCode=0 Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.092131 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zhgh4" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.092365 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zhgh4" event={"ID":"403ca5f6-bd52-40de-88d6-5151b3202c76","Type":"ContainerDied","Data":"71b2cb77686050f112cbf04ff04d1685bf9b88c82f8a98a4da293848bc821143"} Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.093044 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f20cb95e-5480-4c9c-859f-0b03d679ab06-kube-api-access-kg46p" (OuterVolumeSpecName: "kube-api-access-kg46p") pod "f20cb95e-5480-4c9c-859f-0b03d679ab06" (UID: "f20cb95e-5480-4c9c-859f-0b03d679ab06"). InnerVolumeSpecName "kube-api-access-kg46p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.093077 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zhgh4" event={"ID":"403ca5f6-bd52-40de-88d6-5151b3202c76","Type":"ContainerDied","Data":"2383508dcb35c927549b146a858bc10ffc0e92b071146282263459310c1e4a93"} Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.100012 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/403ca5f6-bd52-40de-88d6-5151b3202c76-kube-api-access-hqvlf" (OuterVolumeSpecName: "kube-api-access-hqvlf") pod "403ca5f6-bd52-40de-88d6-5151b3202c76" (UID: "403ca5f6-bd52-40de-88d6-5151b3202c76"). InnerVolumeSpecName "kube-api-access-hqvlf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.103497 4860 generic.go:334] "Generic (PLEG): container finished" podID="514f05c3-1404-46c6-9f4d-68437ea8ee0b" containerID="dd62fe82165d292d4799e06ebb35b908db080de2fe5e228a5d81891d222fa958" exitCode=0 Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.103846 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qq8bh" event={"ID":"514f05c3-1404-46c6-9f4d-68437ea8ee0b","Type":"ContainerDied","Data":"dd62fe82165d292d4799e06ebb35b908db080de2fe5e228a5d81891d222fa958"} Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.104020 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qq8bh" event={"ID":"514f05c3-1404-46c6-9f4d-68437ea8ee0b","Type":"ContainerDied","Data":"db09468d977aabd81ce312da99eaa8c50b25e5282affd310a612fbfda038e94c"} Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.104215 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qq8bh" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.108607 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qkfjv"] Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.110035 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/403ca5f6-bd52-40de-88d6-5151b3202c76-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "403ca5f6-bd52-40de-88d6-5151b3202c76" (UID: "403ca5f6-bd52-40de-88d6-5151b3202c76"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.114032 4860 generic.go:334] "Generic (PLEG): container finished" podID="4f84f111-5991-4e78-9508-82283b8e36f7" containerID="bd0572960d7229285fe10a8cb3d58bbc9ca87bc5ab990b09db80034d14aa31ff" exitCode=0 Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.114129 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w5w95" event={"ID":"4f84f111-5991-4e78-9508-82283b8e36f7","Type":"ContainerDied","Data":"bd0572960d7229285fe10a8cb3d58bbc9ca87bc5ab990b09db80034d14aa31ff"} Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.114155 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w5w95" event={"ID":"4f84f111-5991-4e78-9508-82283b8e36f7","Type":"ContainerDied","Data":"1f538c1360593e9a410b70b066b34c33f5665e2dac735a2212ce3b3dbdf2dce0"} Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.114267 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-w5w95" Mar 20 11:02:05 crc kubenswrapper[4860]: W0320 11:02:05.122806 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod489f9463_a47c_4635_aad3_866e47a2c97f.slice/crio-a32770f37d964985947e6b18171cb2879bd6d548a75a3c5df5ff64cac3f0250e WatchSource:0}: Error finding container a32770f37d964985947e6b18171cb2879bd6d548a75a3c5df5ff64cac3f0250e: Status 404 returned error can't find the container with id a32770f37d964985947e6b18171cb2879bd6d548a75a3c5df5ff64cac3f0250e Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.123996 4860 generic.go:334] "Generic (PLEG): container finished" podID="d2690d8b-c7f7-4e71-af44-33444e4d6187" containerID="8e14e348104a685ae72ae8d4fc93750bfae41d97762c8a26ba93a04b1dc8a3d4" exitCode=0 Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.124284 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n79b7" event={"ID":"d2690d8b-c7f7-4e71-af44-33444e4d6187","Type":"ContainerDied","Data":"8e14e348104a685ae72ae8d4fc93750bfae41d97762c8a26ba93a04b1dc8a3d4"} Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.124423 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n79b7" event={"ID":"d2690d8b-c7f7-4e71-af44-33444e4d6187","Type":"ContainerDied","Data":"d3e5a4f45fcca5a9f1ea6868d200bf518732a2225ce154b3a8c6e2fe9edbe0fc"} Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.125866 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-n79b7" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.133248 4860 scope.go:117] "RemoveContainer" containerID="01d41166d3fd0e46f3b970c35d6bd76db150d430d4adb17e4e8f0fd1e47c7e63" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.134123 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f20cb95e-5480-4c9c-859f-0b03d679ab06-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f20cb95e-5480-4c9c-859f-0b03d679ab06" (UID: "f20cb95e-5480-4c9c-859f-0b03d679ab06"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.158841 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-w5w95"] Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.186612 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-w5w95"] Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.186982 4860 scope.go:117] "RemoveContainer" containerID="3c2e183391f543c0b20ce684b31a9a8169bae71ceb6b6c5028bb8d294a9daeac" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.190817 4860 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f20cb95e-5480-4c9c-859f-0b03d679ab06-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.190856 4860 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/403ca5f6-bd52-40de-88d6-5151b3202c76-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.190868 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kg46p\" (UniqueName: 
\"kubernetes.io/projected/f20cb95e-5480-4c9c-859f-0b03d679ab06-kube-api-access-kg46p\") on node \"crc\" DevicePath \"\"" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.190879 4860 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/403ca5f6-bd52-40de-88d6-5151b3202c76-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.190889 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hqvlf\" (UniqueName: \"kubernetes.io/projected/403ca5f6-bd52-40de-88d6-5151b3202c76-kube-api-access-hqvlf\") on node \"crc\" DevicePath \"\"" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.190904 4860 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f20cb95e-5480-4c9c-859f-0b03d679ab06-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.222317 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-zjfp8"] Mar 20 11:02:05 crc kubenswrapper[4860]: E0320 11:02:05.222964 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2690d8b-c7f7-4e71-af44-33444e4d6187" containerName="extract-utilities" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.222986 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2690d8b-c7f7-4e71-af44-33444e4d6187" containerName="extract-utilities" Mar 20 11:02:05 crc kubenswrapper[4860]: E0320 11:02:05.223006 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f20cb95e-5480-4c9c-859f-0b03d679ab06" containerName="extract-utilities" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.223038 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="f20cb95e-5480-4c9c-859f-0b03d679ab06" containerName="extract-utilities" Mar 20 11:02:05 crc kubenswrapper[4860]: E0320 
11:02:05.223053 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2690d8b-c7f7-4e71-af44-33444e4d6187" containerName="registry-server" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.223060 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2690d8b-c7f7-4e71-af44-33444e4d6187" containerName="registry-server" Mar 20 11:02:05 crc kubenswrapper[4860]: E0320 11:02:05.223077 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f20cb95e-5480-4c9c-859f-0b03d679ab06" containerName="registry-server" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.223083 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="f20cb95e-5480-4c9c-859f-0b03d679ab06" containerName="registry-server" Mar 20 11:02:05 crc kubenswrapper[4860]: E0320 11:02:05.223095 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2690d8b-c7f7-4e71-af44-33444e4d6187" containerName="extract-content" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.223100 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2690d8b-c7f7-4e71-af44-33444e4d6187" containerName="extract-content" Mar 20 11:02:05 crc kubenswrapper[4860]: E0320 11:02:05.223115 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="403ca5f6-bd52-40de-88d6-5151b3202c76" containerName="marketplace-operator" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.223121 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="403ca5f6-bd52-40de-88d6-5151b3202c76" containerName="marketplace-operator" Mar 20 11:02:05 crc kubenswrapper[4860]: E0320 11:02:05.223134 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="403ca5f6-bd52-40de-88d6-5151b3202c76" containerName="marketplace-operator" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.223140 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="403ca5f6-bd52-40de-88d6-5151b3202c76" containerName="marketplace-operator" Mar 20 11:02:05 crc kubenswrapper[4860]: 
E0320 11:02:05.223160 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f84f111-5991-4e78-9508-82283b8e36f7" containerName="extract-content" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.223166 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f84f111-5991-4e78-9508-82283b8e36f7" containerName="extract-content" Mar 20 11:02:05 crc kubenswrapper[4860]: E0320 11:02:05.223179 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f84f111-5991-4e78-9508-82283b8e36f7" containerName="extract-utilities" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.223186 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f84f111-5991-4e78-9508-82283b8e36f7" containerName="extract-utilities" Mar 20 11:02:05 crc kubenswrapper[4860]: E0320 11:02:05.223194 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="514f05c3-1404-46c6-9f4d-68437ea8ee0b" containerName="extract-utilities" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.223199 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="514f05c3-1404-46c6-9f4d-68437ea8ee0b" containerName="extract-utilities" Mar 20 11:02:05 crc kubenswrapper[4860]: E0320 11:02:05.223210 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="514f05c3-1404-46c6-9f4d-68437ea8ee0b" containerName="registry-server" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.223236 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="514f05c3-1404-46c6-9f4d-68437ea8ee0b" containerName="registry-server" Mar 20 11:02:05 crc kubenswrapper[4860]: E0320 11:02:05.223244 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f20cb95e-5480-4c9c-859f-0b03d679ab06" containerName="extract-content" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.223250 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="f20cb95e-5480-4c9c-859f-0b03d679ab06" containerName="extract-content" Mar 20 11:02:05 crc kubenswrapper[4860]: E0320 
11:02:05.223263 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f84f111-5991-4e78-9508-82283b8e36f7" containerName="registry-server" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.223269 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f84f111-5991-4e78-9508-82283b8e36f7" containerName="registry-server" Mar 20 11:02:05 crc kubenswrapper[4860]: E0320 11:02:05.223279 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="514f05c3-1404-46c6-9f4d-68437ea8ee0b" containerName="extract-content" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.223288 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="514f05c3-1404-46c6-9f4d-68437ea8ee0b" containerName="extract-content" Mar 20 11:02:05 crc kubenswrapper[4860]: E0320 11:02:05.223302 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="980f5756-4935-469d-933b-f4e339ded9a4" containerName="oc" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.223308 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="980f5756-4935-469d-933b-f4e339ded9a4" containerName="oc" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.223465 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2690d8b-c7f7-4e71-af44-33444e4d6187" containerName="registry-server" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.223480 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="980f5756-4935-469d-933b-f4e339ded9a4" containerName="oc" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.223491 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="514f05c3-1404-46c6-9f4d-68437ea8ee0b" containerName="registry-server" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.223502 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="403ca5f6-bd52-40de-88d6-5151b3202c76" containerName="marketplace-operator" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.223535 4860 
memory_manager.go:354] "RemoveStaleState removing state" podUID="4f84f111-5991-4e78-9508-82283b8e36f7" containerName="registry-server" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.223547 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="f20cb95e-5480-4c9c-859f-0b03d679ab06" containerName="registry-server" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.224242 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-zjfp8" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.242698 4860 scope.go:117] "RemoveContainer" containerID="3d9773a58e39a2f11588636465b237f3ac7e55d6ecd57abbb192e60cfa1df11f" Mar 20 11:02:05 crc kubenswrapper[4860]: E0320 11:02:05.243464 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d9773a58e39a2f11588636465b237f3ac7e55d6ecd57abbb192e60cfa1df11f\": container with ID starting with 3d9773a58e39a2f11588636465b237f3ac7e55d6ecd57abbb192e60cfa1df11f not found: ID does not exist" containerID="3d9773a58e39a2f11588636465b237f3ac7e55d6ecd57abbb192e60cfa1df11f" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.243641 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d9773a58e39a2f11588636465b237f3ac7e55d6ecd57abbb192e60cfa1df11f"} err="failed to get container status \"3d9773a58e39a2f11588636465b237f3ac7e55d6ecd57abbb192e60cfa1df11f\": rpc error: code = NotFound desc = could not find container \"3d9773a58e39a2f11588636465b237f3ac7e55d6ecd57abbb192e60cfa1df11f\": container with ID starting with 3d9773a58e39a2f11588636465b237f3ac7e55d6ecd57abbb192e60cfa1df11f not found: ID does not exist" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.243802 4860 scope.go:117] "RemoveContainer" containerID="01d41166d3fd0e46f3b970c35d6bd76db150d430d4adb17e4e8f0fd1e47c7e63" Mar 20 11:02:05 crc kubenswrapper[4860]: 
E0320 11:02:05.244246 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01d41166d3fd0e46f3b970c35d6bd76db150d430d4adb17e4e8f0fd1e47c7e63\": container with ID starting with 01d41166d3fd0e46f3b970c35d6bd76db150d430d4adb17e4e8f0fd1e47c7e63 not found: ID does not exist" containerID="01d41166d3fd0e46f3b970c35d6bd76db150d430d4adb17e4e8f0fd1e47c7e63" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.244278 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01d41166d3fd0e46f3b970c35d6bd76db150d430d4adb17e4e8f0fd1e47c7e63"} err="failed to get container status \"01d41166d3fd0e46f3b970c35d6bd76db150d430d4adb17e4e8f0fd1e47c7e63\": rpc error: code = NotFound desc = could not find container \"01d41166d3fd0e46f3b970c35d6bd76db150d430d4adb17e4e8f0fd1e47c7e63\": container with ID starting with 01d41166d3fd0e46f3b970c35d6bd76db150d430d4adb17e4e8f0fd1e47c7e63 not found: ID does not exist" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.244301 4860 scope.go:117] "RemoveContainer" containerID="3c2e183391f543c0b20ce684b31a9a8169bae71ceb6b6c5028bb8d294a9daeac" Mar 20 11:02:05 crc kubenswrapper[4860]: E0320 11:02:05.244702 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c2e183391f543c0b20ce684b31a9a8169bae71ceb6b6c5028bb8d294a9daeac\": container with ID starting with 3c2e183391f543c0b20ce684b31a9a8169bae71ceb6b6c5028bb8d294a9daeac not found: ID does not exist" containerID="3c2e183391f543c0b20ce684b31a9a8169bae71ceb6b6c5028bb8d294a9daeac" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.244749 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c2e183391f543c0b20ce684b31a9a8169bae71ceb6b6c5028bb8d294a9daeac"} err="failed to get container status \"3c2e183391f543c0b20ce684b31a9a8169bae71ceb6b6c5028bb8d294a9daeac\": 
rpc error: code = NotFound desc = could not find container \"3c2e183391f543c0b20ce684b31a9a8169bae71ceb6b6c5028bb8d294a9daeac\": container with ID starting with 3c2e183391f543c0b20ce684b31a9a8169bae71ceb6b6c5028bb8d294a9daeac not found: ID does not exist" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.244776 4860 scope.go:117] "RemoveContainer" containerID="71b2cb77686050f112cbf04ff04d1685bf9b88c82f8a98a4da293848bc821143" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.248024 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-zjfp8"] Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.250831 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/514f05c3-1404-46c6-9f4d-68437ea8ee0b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "514f05c3-1404-46c6-9f4d-68437ea8ee0b" (UID: "514f05c3-1404-46c6-9f4d-68437ea8ee0b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.281060 4860 scope.go:117] "RemoveContainer" containerID="857625adfd41b4c07f7a8c47d596e9428a9381932a0175f6ddfcb8f7de0229a9" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.288588 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-n79b7"] Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.294062 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkzrv\" (UniqueName: \"kubernetes.io/projected/09980524-2db0-4279-8e7c-09d82081be4b-kube-api-access-tkzrv\") pod \"image-registry-66df7c8f76-zjfp8\" (UID: \"09980524-2db0-4279-8e7c-09d82081be4b\") " pod="openshift-image-registry/image-registry-66df7c8f76-zjfp8" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.294127 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/09980524-2db0-4279-8e7c-09d82081be4b-ca-trust-extracted\") pod \"image-registry-66df7c8f76-zjfp8\" (UID: \"09980524-2db0-4279-8e7c-09d82081be4b\") " pod="openshift-image-registry/image-registry-66df7c8f76-zjfp8" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.294204 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/09980524-2db0-4279-8e7c-09d82081be4b-installation-pull-secrets\") pod \"image-registry-66df7c8f76-zjfp8\" (UID: \"09980524-2db0-4279-8e7c-09d82081be4b\") " pod="openshift-image-registry/image-registry-66df7c8f76-zjfp8" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.294256 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/09980524-2db0-4279-8e7c-09d82081be4b-bound-sa-token\") pod \"image-registry-66df7c8f76-zjfp8\" (UID: \"09980524-2db0-4279-8e7c-09d82081be4b\") " pod="openshift-image-registry/image-registry-66df7c8f76-zjfp8" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.294300 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/09980524-2db0-4279-8e7c-09d82081be4b-registry-tls\") pod \"image-registry-66df7c8f76-zjfp8\" (UID: \"09980524-2db0-4279-8e7c-09d82081be4b\") " pod="openshift-image-registry/image-registry-66df7c8f76-zjfp8" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.294325 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/09980524-2db0-4279-8e7c-09d82081be4b-registry-certificates\") pod \"image-registry-66df7c8f76-zjfp8\" (UID: \"09980524-2db0-4279-8e7c-09d82081be4b\") " pod="openshift-image-registry/image-registry-66df7c8f76-zjfp8" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.294348 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/09980524-2db0-4279-8e7c-09d82081be4b-trusted-ca\") pod \"image-registry-66df7c8f76-zjfp8\" (UID: \"09980524-2db0-4279-8e7c-09d82081be4b\") " pod="openshift-image-registry/image-registry-66df7c8f76-zjfp8" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.294386 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-zjfp8\" (UID: \"09980524-2db0-4279-8e7c-09d82081be4b\") " pod="openshift-image-registry/image-registry-66df7c8f76-zjfp8" Mar 20 11:02:05 crc 
kubenswrapper[4860]: I0320 11:02:05.294443 4860 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/514f05c3-1404-46c6-9f4d-68437ea8ee0b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.309928 4860 scope.go:117] "RemoveContainer" containerID="71b2cb77686050f112cbf04ff04d1685bf9b88c82f8a98a4da293848bc821143" Mar 20 11:02:05 crc kubenswrapper[4860]: E0320 11:02:05.311039 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71b2cb77686050f112cbf04ff04d1685bf9b88c82f8a98a4da293848bc821143\": container with ID starting with 71b2cb77686050f112cbf04ff04d1685bf9b88c82f8a98a4da293848bc821143 not found: ID does not exist" containerID="71b2cb77686050f112cbf04ff04d1685bf9b88c82f8a98a4da293848bc821143" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.311102 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71b2cb77686050f112cbf04ff04d1685bf9b88c82f8a98a4da293848bc821143"} err="failed to get container status \"71b2cb77686050f112cbf04ff04d1685bf9b88c82f8a98a4da293848bc821143\": rpc error: code = NotFound desc = could not find container \"71b2cb77686050f112cbf04ff04d1685bf9b88c82f8a98a4da293848bc821143\": container with ID starting with 71b2cb77686050f112cbf04ff04d1685bf9b88c82f8a98a4da293848bc821143 not found: ID does not exist" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.311146 4860 scope.go:117] "RemoveContainer" containerID="857625adfd41b4c07f7a8c47d596e9428a9381932a0175f6ddfcb8f7de0229a9" Mar 20 11:02:05 crc kubenswrapper[4860]: E0320 11:02:05.311707 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"857625adfd41b4c07f7a8c47d596e9428a9381932a0175f6ddfcb8f7de0229a9\": container with ID starting with 
857625adfd41b4c07f7a8c47d596e9428a9381932a0175f6ddfcb8f7de0229a9 not found: ID does not exist" containerID="857625adfd41b4c07f7a8c47d596e9428a9381932a0175f6ddfcb8f7de0229a9" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.311778 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"857625adfd41b4c07f7a8c47d596e9428a9381932a0175f6ddfcb8f7de0229a9"} err="failed to get container status \"857625adfd41b4c07f7a8c47d596e9428a9381932a0175f6ddfcb8f7de0229a9\": rpc error: code = NotFound desc = could not find container \"857625adfd41b4c07f7a8c47d596e9428a9381932a0175f6ddfcb8f7de0229a9\": container with ID starting with 857625adfd41b4c07f7a8c47d596e9428a9381932a0175f6ddfcb8f7de0229a9 not found: ID does not exist" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.311827 4860 scope.go:117] "RemoveContainer" containerID="dd62fe82165d292d4799e06ebb35b908db080de2fe5e228a5d81891d222fa958" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.314237 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-n79b7"] Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.324646 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-zjfp8\" (UID: \"09980524-2db0-4279-8e7c-09d82081be4b\") " pod="openshift-image-registry/image-registry-66df7c8f76-zjfp8" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.328259 4860 scope.go:117] "RemoveContainer" containerID="844c418dac774548dbf403e3c458e1c031e1eab485a8882313d1a36213143b22" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.374614 4860 scope.go:117] "RemoveContainer" containerID="62152cf84060d8945786af63e9ccf7c263d87bdc6a8315bb08a5eacc7087c847" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.388304 4860 
scope.go:117] "RemoveContainer" containerID="dd62fe82165d292d4799e06ebb35b908db080de2fe5e228a5d81891d222fa958" Mar 20 11:02:05 crc kubenswrapper[4860]: E0320 11:02:05.388800 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd62fe82165d292d4799e06ebb35b908db080de2fe5e228a5d81891d222fa958\": container with ID starting with dd62fe82165d292d4799e06ebb35b908db080de2fe5e228a5d81891d222fa958 not found: ID does not exist" containerID="dd62fe82165d292d4799e06ebb35b908db080de2fe5e228a5d81891d222fa958" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.388849 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd62fe82165d292d4799e06ebb35b908db080de2fe5e228a5d81891d222fa958"} err="failed to get container status \"dd62fe82165d292d4799e06ebb35b908db080de2fe5e228a5d81891d222fa958\": rpc error: code = NotFound desc = could not find container \"dd62fe82165d292d4799e06ebb35b908db080de2fe5e228a5d81891d222fa958\": container with ID starting with dd62fe82165d292d4799e06ebb35b908db080de2fe5e228a5d81891d222fa958 not found: ID does not exist" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.388881 4860 scope.go:117] "RemoveContainer" containerID="844c418dac774548dbf403e3c458e1c031e1eab485a8882313d1a36213143b22" Mar 20 11:02:05 crc kubenswrapper[4860]: E0320 11:02:05.389650 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"844c418dac774548dbf403e3c458e1c031e1eab485a8882313d1a36213143b22\": container with ID starting with 844c418dac774548dbf403e3c458e1c031e1eab485a8882313d1a36213143b22 not found: ID does not exist" containerID="844c418dac774548dbf403e3c458e1c031e1eab485a8882313d1a36213143b22" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.389687 4860 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"844c418dac774548dbf403e3c458e1c031e1eab485a8882313d1a36213143b22"} err="failed to get container status \"844c418dac774548dbf403e3c458e1c031e1eab485a8882313d1a36213143b22\": rpc error: code = NotFound desc = could not find container \"844c418dac774548dbf403e3c458e1c031e1eab485a8882313d1a36213143b22\": container with ID starting with 844c418dac774548dbf403e3c458e1c031e1eab485a8882313d1a36213143b22 not found: ID does not exist" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.389721 4860 scope.go:117] "RemoveContainer" containerID="62152cf84060d8945786af63e9ccf7c263d87bdc6a8315bb08a5eacc7087c847" Mar 20 11:02:05 crc kubenswrapper[4860]: E0320 11:02:05.390168 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62152cf84060d8945786af63e9ccf7c263d87bdc6a8315bb08a5eacc7087c847\": container with ID starting with 62152cf84060d8945786af63e9ccf7c263d87bdc6a8315bb08a5eacc7087c847 not found: ID does not exist" containerID="62152cf84060d8945786af63e9ccf7c263d87bdc6a8315bb08a5eacc7087c847" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.390237 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62152cf84060d8945786af63e9ccf7c263d87bdc6a8315bb08a5eacc7087c847"} err="failed to get container status \"62152cf84060d8945786af63e9ccf7c263d87bdc6a8315bb08a5eacc7087c847\": rpc error: code = NotFound desc = could not find container \"62152cf84060d8945786af63e9ccf7c263d87bdc6a8315bb08a5eacc7087c847\": container with ID starting with 62152cf84060d8945786af63e9ccf7c263d87bdc6a8315bb08a5eacc7087c847 not found: ID does not exist" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.390279 4860 scope.go:117] "RemoveContainer" containerID="bd0572960d7229285fe10a8cb3d58bbc9ca87bc5ab990b09db80034d14aa31ff" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.395988 4860 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/09980524-2db0-4279-8e7c-09d82081be4b-registry-certificates\") pod \"image-registry-66df7c8f76-zjfp8\" (UID: \"09980524-2db0-4279-8e7c-09d82081be4b\") " pod="openshift-image-registry/image-registry-66df7c8f76-zjfp8" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.396059 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/09980524-2db0-4279-8e7c-09d82081be4b-trusted-ca\") pod \"image-registry-66df7c8f76-zjfp8\" (UID: \"09980524-2db0-4279-8e7c-09d82081be4b\") " pod="openshift-image-registry/image-registry-66df7c8f76-zjfp8" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.396132 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkzrv\" (UniqueName: \"kubernetes.io/projected/09980524-2db0-4279-8e7c-09d82081be4b-kube-api-access-tkzrv\") pod \"image-registry-66df7c8f76-zjfp8\" (UID: \"09980524-2db0-4279-8e7c-09d82081be4b\") " pod="openshift-image-registry/image-registry-66df7c8f76-zjfp8" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.396175 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/09980524-2db0-4279-8e7c-09d82081be4b-ca-trust-extracted\") pod \"image-registry-66df7c8f76-zjfp8\" (UID: \"09980524-2db0-4279-8e7c-09d82081be4b\") " pod="openshift-image-registry/image-registry-66df7c8f76-zjfp8" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.396247 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/09980524-2db0-4279-8e7c-09d82081be4b-installation-pull-secrets\") pod \"image-registry-66df7c8f76-zjfp8\" (UID: \"09980524-2db0-4279-8e7c-09d82081be4b\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-zjfp8" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.396269 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/09980524-2db0-4279-8e7c-09d82081be4b-bound-sa-token\") pod \"image-registry-66df7c8f76-zjfp8\" (UID: \"09980524-2db0-4279-8e7c-09d82081be4b\") " pod="openshift-image-registry/image-registry-66df7c8f76-zjfp8" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.396302 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/09980524-2db0-4279-8e7c-09d82081be4b-registry-tls\") pod \"image-registry-66df7c8f76-zjfp8\" (UID: \"09980524-2db0-4279-8e7c-09d82081be4b\") " pod="openshift-image-registry/image-registry-66df7c8f76-zjfp8" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.396970 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/09980524-2db0-4279-8e7c-09d82081be4b-ca-trust-extracted\") pod \"image-registry-66df7c8f76-zjfp8\" (UID: \"09980524-2db0-4279-8e7c-09d82081be4b\") " pod="openshift-image-registry/image-registry-66df7c8f76-zjfp8" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.397479 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/09980524-2db0-4279-8e7c-09d82081be4b-registry-certificates\") pod \"image-registry-66df7c8f76-zjfp8\" (UID: \"09980524-2db0-4279-8e7c-09d82081be4b\") " pod="openshift-image-registry/image-registry-66df7c8f76-zjfp8" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.398844 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/09980524-2db0-4279-8e7c-09d82081be4b-trusted-ca\") pod \"image-registry-66df7c8f76-zjfp8\" (UID: 
\"09980524-2db0-4279-8e7c-09d82081be4b\") " pod="openshift-image-registry/image-registry-66df7c8f76-zjfp8" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.402104 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/09980524-2db0-4279-8e7c-09d82081be4b-registry-tls\") pod \"image-registry-66df7c8f76-zjfp8\" (UID: \"09980524-2db0-4279-8e7c-09d82081be4b\") " pod="openshift-image-registry/image-registry-66df7c8f76-zjfp8" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.402845 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/09980524-2db0-4279-8e7c-09d82081be4b-installation-pull-secrets\") pod \"image-registry-66df7c8f76-zjfp8\" (UID: \"09980524-2db0-4279-8e7c-09d82081be4b\") " pod="openshift-image-registry/image-registry-66df7c8f76-zjfp8" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.409208 4860 scope.go:117] "RemoveContainer" containerID="1e4d0d88e053aac4fbcbc20ef32b8805ad5fa659b78c3257d2c273ebeb4ec532" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.413581 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/09980524-2db0-4279-8e7c-09d82081be4b-bound-sa-token\") pod \"image-registry-66df7c8f76-zjfp8\" (UID: \"09980524-2db0-4279-8e7c-09d82081be4b\") " pod="openshift-image-registry/image-registry-66df7c8f76-zjfp8" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.417499 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkzrv\" (UniqueName: \"kubernetes.io/projected/09980524-2db0-4279-8e7c-09d82081be4b-kube-api-access-tkzrv\") pod \"image-registry-66df7c8f76-zjfp8\" (UID: \"09980524-2db0-4279-8e7c-09d82081be4b\") " pod="openshift-image-registry/image-registry-66df7c8f76-zjfp8" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.423510 4860 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f84f111-5991-4e78-9508-82283b8e36f7" path="/var/lib/kubelet/pods/4f84f111-5991-4e78-9508-82283b8e36f7/volumes" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.424411 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2690d8b-c7f7-4e71-af44-33444e4d6187" path="/var/lib/kubelet/pods/d2690d8b-c7f7-4e71-af44-33444e4d6187/volumes" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.448947 4860 scope.go:117] "RemoveContainer" containerID="6a6c388a79209365a4c51728af06e871b437cb5dd7b79151114287e046dc0dfe" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.466408 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zhgh4"] Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.470492 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zhgh4"] Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.497417 4860 scope.go:117] "RemoveContainer" containerID="bd0572960d7229285fe10a8cb3d58bbc9ca87bc5ab990b09db80034d14aa31ff" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.497546 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d9xlp"] Mar 20 11:02:05 crc kubenswrapper[4860]: E0320 11:02:05.501346 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd0572960d7229285fe10a8cb3d58bbc9ca87bc5ab990b09db80034d14aa31ff\": container with ID starting with bd0572960d7229285fe10a8cb3d58bbc9ca87bc5ab990b09db80034d14aa31ff not found: ID does not exist" containerID="bd0572960d7229285fe10a8cb3d58bbc9ca87bc5ab990b09db80034d14aa31ff" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.501379 4860 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"bd0572960d7229285fe10a8cb3d58bbc9ca87bc5ab990b09db80034d14aa31ff"} err="failed to get container status \"bd0572960d7229285fe10a8cb3d58bbc9ca87bc5ab990b09db80034d14aa31ff\": rpc error: code = NotFound desc = could not find container \"bd0572960d7229285fe10a8cb3d58bbc9ca87bc5ab990b09db80034d14aa31ff\": container with ID starting with bd0572960d7229285fe10a8cb3d58bbc9ca87bc5ab990b09db80034d14aa31ff not found: ID does not exist" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.501431 4860 scope.go:117] "RemoveContainer" containerID="1e4d0d88e053aac4fbcbc20ef32b8805ad5fa659b78c3257d2c273ebeb4ec532" Mar 20 11:02:05 crc kubenswrapper[4860]: E0320 11:02:05.506753 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e4d0d88e053aac4fbcbc20ef32b8805ad5fa659b78c3257d2c273ebeb4ec532\": container with ID starting with 1e4d0d88e053aac4fbcbc20ef32b8805ad5fa659b78c3257d2c273ebeb4ec532 not found: ID does not exist" containerID="1e4d0d88e053aac4fbcbc20ef32b8805ad5fa659b78c3257d2c273ebeb4ec532" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.506805 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e4d0d88e053aac4fbcbc20ef32b8805ad5fa659b78c3257d2c273ebeb4ec532"} err="failed to get container status \"1e4d0d88e053aac4fbcbc20ef32b8805ad5fa659b78c3257d2c273ebeb4ec532\": rpc error: code = NotFound desc = could not find container \"1e4d0d88e053aac4fbcbc20ef32b8805ad5fa659b78c3257d2c273ebeb4ec532\": container with ID starting with 1e4d0d88e053aac4fbcbc20ef32b8805ad5fa659b78c3257d2c273ebeb4ec532 not found: ID does not exist" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.506854 4860 scope.go:117] "RemoveContainer" containerID="6a6c388a79209365a4c51728af06e871b437cb5dd7b79151114287e046dc0dfe" Mar 20 11:02:05 crc kubenswrapper[4860]: E0320 11:02:05.507822 4860 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"6a6c388a79209365a4c51728af06e871b437cb5dd7b79151114287e046dc0dfe\": container with ID starting with 6a6c388a79209365a4c51728af06e871b437cb5dd7b79151114287e046dc0dfe not found: ID does not exist" containerID="6a6c388a79209365a4c51728af06e871b437cb5dd7b79151114287e046dc0dfe" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.507886 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a6c388a79209365a4c51728af06e871b437cb5dd7b79151114287e046dc0dfe"} err="failed to get container status \"6a6c388a79209365a4c51728af06e871b437cb5dd7b79151114287e046dc0dfe\": rpc error: code = NotFound desc = could not find container \"6a6c388a79209365a4c51728af06e871b437cb5dd7b79151114287e046dc0dfe\": container with ID starting with 6a6c388a79209365a4c51728af06e871b437cb5dd7b79151114287e046dc0dfe not found: ID does not exist" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.507934 4860 scope.go:117] "RemoveContainer" containerID="8e14e348104a685ae72ae8d4fc93750bfae41d97762c8a26ba93a04b1dc8a3d4" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.510460 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-d9xlp"] Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.514320 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qq8bh"] Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.517125 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qq8bh"] Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.565846 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-zjfp8" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.566864 4860 scope.go:117] "RemoveContainer" containerID="dca022e41de9a1a813fac9c51ffa02dd710183b0522c00eebac503fc8a8a606a" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.597981 4860 scope.go:117] "RemoveContainer" containerID="fc706b2c173f49d2a44bb4c1c738033acd0d9e47df9115abcd3736cf3695dc3d" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.618898 4860 scope.go:117] "RemoveContainer" containerID="8e14e348104a685ae72ae8d4fc93750bfae41d97762c8a26ba93a04b1dc8a3d4" Mar 20 11:02:05 crc kubenswrapper[4860]: E0320 11:02:05.619845 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e14e348104a685ae72ae8d4fc93750bfae41d97762c8a26ba93a04b1dc8a3d4\": container with ID starting with 8e14e348104a685ae72ae8d4fc93750bfae41d97762c8a26ba93a04b1dc8a3d4 not found: ID does not exist" containerID="8e14e348104a685ae72ae8d4fc93750bfae41d97762c8a26ba93a04b1dc8a3d4" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.619885 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e14e348104a685ae72ae8d4fc93750bfae41d97762c8a26ba93a04b1dc8a3d4"} err="failed to get container status \"8e14e348104a685ae72ae8d4fc93750bfae41d97762c8a26ba93a04b1dc8a3d4\": rpc error: code = NotFound desc = could not find container \"8e14e348104a685ae72ae8d4fc93750bfae41d97762c8a26ba93a04b1dc8a3d4\": container with ID starting with 8e14e348104a685ae72ae8d4fc93750bfae41d97762c8a26ba93a04b1dc8a3d4 not found: ID does not exist" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.619923 4860 scope.go:117] "RemoveContainer" containerID="dca022e41de9a1a813fac9c51ffa02dd710183b0522c00eebac503fc8a8a606a" Mar 20 11:02:05 crc kubenswrapper[4860]: E0320 11:02:05.620422 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = could not find container \"dca022e41de9a1a813fac9c51ffa02dd710183b0522c00eebac503fc8a8a606a\": container with ID starting with dca022e41de9a1a813fac9c51ffa02dd710183b0522c00eebac503fc8a8a606a not found: ID does not exist" containerID="dca022e41de9a1a813fac9c51ffa02dd710183b0522c00eebac503fc8a8a606a" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.620456 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dca022e41de9a1a813fac9c51ffa02dd710183b0522c00eebac503fc8a8a606a"} err="failed to get container status \"dca022e41de9a1a813fac9c51ffa02dd710183b0522c00eebac503fc8a8a606a\": rpc error: code = NotFound desc = could not find container \"dca022e41de9a1a813fac9c51ffa02dd710183b0522c00eebac503fc8a8a606a\": container with ID starting with dca022e41de9a1a813fac9c51ffa02dd710183b0522c00eebac503fc8a8a606a not found: ID does not exist" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.620506 4860 scope.go:117] "RemoveContainer" containerID="fc706b2c173f49d2a44bb4c1c738033acd0d9e47df9115abcd3736cf3695dc3d" Mar 20 11:02:05 crc kubenswrapper[4860]: E0320 11:02:05.620794 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc706b2c173f49d2a44bb4c1c738033acd0d9e47df9115abcd3736cf3695dc3d\": container with ID starting with fc706b2c173f49d2a44bb4c1c738033acd0d9e47df9115abcd3736cf3695dc3d not found: ID does not exist" containerID="fc706b2c173f49d2a44bb4c1c738033acd0d9e47df9115abcd3736cf3695dc3d" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.620824 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc706b2c173f49d2a44bb4c1c738033acd0d9e47df9115abcd3736cf3695dc3d"} err="failed to get container status \"fc706b2c173f49d2a44bb4c1c738033acd0d9e47df9115abcd3736cf3695dc3d\": rpc error: code = NotFound desc = could not find container 
\"fc706b2c173f49d2a44bb4c1c738033acd0d9e47df9115abcd3736cf3695dc3d\": container with ID starting with fc706b2c173f49d2a44bb4c1c738033acd0d9e47df9115abcd3736cf3695dc3d not found: ID does not exist" Mar 20 11:02:05 crc kubenswrapper[4860]: I0320 11:02:05.796124 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-zjfp8"] Mar 20 11:02:06 crc kubenswrapper[4860]: I0320 11:02:06.154885 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-zjfp8" event={"ID":"09980524-2db0-4279-8e7c-09d82081be4b","Type":"ContainerStarted","Data":"bf928a9e95ea76a816887b1b1988b6034b7e02a96cd7b5632d3929848377094e"} Mar 20 11:02:06 crc kubenswrapper[4860]: I0320 11:02:06.158837 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qkfjv" event={"ID":"489f9463-a47c-4635-aad3-866e47a2c97f","Type":"ContainerStarted","Data":"d7908b32dfb6ed12940916e4e1523db0b8b1a86b415031e69430c3cc2102f94f"} Mar 20 11:02:06 crc kubenswrapper[4860]: I0320 11:02:06.158890 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qkfjv" event={"ID":"489f9463-a47c-4635-aad3-866e47a2c97f","Type":"ContainerStarted","Data":"a32770f37d964985947e6b18171cb2879bd6d548a75a3c5df5ff64cac3f0250e"} Mar 20 11:02:06 crc kubenswrapper[4860]: I0320 11:02:06.159511 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-qkfjv" Mar 20 11:02:06 crc kubenswrapper[4860]: I0320 11:02:06.166804 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-qkfjv" Mar 20 11:02:06 crc kubenswrapper[4860]: I0320 11:02:06.178680 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-qkfjv" 
podStartSLOduration=2.178650474 podStartE2EDuration="2.178650474s" podCreationTimestamp="2026-03-20 11:02:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 11:02:06.17741811 +0000 UTC m=+450.398779018" watchObservedRunningTime="2026-03-20 11:02:06.178650474 +0000 UTC m=+450.400011372" Mar 20 11:02:06 crc kubenswrapper[4860]: I0320 11:02:06.974597 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zrj5v"] Mar 20 11:02:06 crc kubenswrapper[4860]: I0320 11:02:06.974982 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="403ca5f6-bd52-40de-88d6-5151b3202c76" containerName="marketplace-operator" Mar 20 11:02:06 crc kubenswrapper[4860]: I0320 11:02:06.975896 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zrj5v" Mar 20 11:02:06 crc kubenswrapper[4860]: I0320 11:02:06.978445 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 20 11:02:06 crc kubenswrapper[4860]: I0320 11:02:06.988404 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zrj5v"] Mar 20 11:02:07 crc kubenswrapper[4860]: I0320 11:02:07.020760 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hqhc\" (UniqueName: \"kubernetes.io/projected/8d34a762-55ad-41cb-994e-d4707bfebe22-kube-api-access-4hqhc\") pod \"redhat-operators-zrj5v\" (UID: \"8d34a762-55ad-41cb-994e-d4707bfebe22\") " pod="openshift-marketplace/redhat-operators-zrj5v" Mar 20 11:02:07 crc kubenswrapper[4860]: I0320 11:02:07.021014 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d34a762-55ad-41cb-994e-d4707bfebe22-utilities\") 
pod \"redhat-operators-zrj5v\" (UID: \"8d34a762-55ad-41cb-994e-d4707bfebe22\") " pod="openshift-marketplace/redhat-operators-zrj5v" Mar 20 11:02:07 crc kubenswrapper[4860]: I0320 11:02:07.021198 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d34a762-55ad-41cb-994e-d4707bfebe22-catalog-content\") pod \"redhat-operators-zrj5v\" (UID: \"8d34a762-55ad-41cb-994e-d4707bfebe22\") " pod="openshift-marketplace/redhat-operators-zrj5v" Mar 20 11:02:07 crc kubenswrapper[4860]: I0320 11:02:07.123582 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d34a762-55ad-41cb-994e-d4707bfebe22-catalog-content\") pod \"redhat-operators-zrj5v\" (UID: \"8d34a762-55ad-41cb-994e-d4707bfebe22\") " pod="openshift-marketplace/redhat-operators-zrj5v" Mar 20 11:02:07 crc kubenswrapper[4860]: I0320 11:02:07.123934 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hqhc\" (UniqueName: \"kubernetes.io/projected/8d34a762-55ad-41cb-994e-d4707bfebe22-kube-api-access-4hqhc\") pod \"redhat-operators-zrj5v\" (UID: \"8d34a762-55ad-41cb-994e-d4707bfebe22\") " pod="openshift-marketplace/redhat-operators-zrj5v" Mar 20 11:02:07 crc kubenswrapper[4860]: I0320 11:02:07.123997 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d34a762-55ad-41cb-994e-d4707bfebe22-utilities\") pod \"redhat-operators-zrj5v\" (UID: \"8d34a762-55ad-41cb-994e-d4707bfebe22\") " pod="openshift-marketplace/redhat-operators-zrj5v" Mar 20 11:02:07 crc kubenswrapper[4860]: I0320 11:02:07.124944 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d34a762-55ad-41cb-994e-d4707bfebe22-utilities\") pod \"redhat-operators-zrj5v\" (UID: 
\"8d34a762-55ad-41cb-994e-d4707bfebe22\") " pod="openshift-marketplace/redhat-operators-zrj5v" Mar 20 11:02:07 crc kubenswrapper[4860]: I0320 11:02:07.125662 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d34a762-55ad-41cb-994e-d4707bfebe22-catalog-content\") pod \"redhat-operators-zrj5v\" (UID: \"8d34a762-55ad-41cb-994e-d4707bfebe22\") " pod="openshift-marketplace/redhat-operators-zrj5v" Mar 20 11:02:07 crc kubenswrapper[4860]: I0320 11:02:07.148675 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hqhc\" (UniqueName: \"kubernetes.io/projected/8d34a762-55ad-41cb-994e-d4707bfebe22-kube-api-access-4hqhc\") pod \"redhat-operators-zrj5v\" (UID: \"8d34a762-55ad-41cb-994e-d4707bfebe22\") " pod="openshift-marketplace/redhat-operators-zrj5v" Mar 20 11:02:07 crc kubenswrapper[4860]: I0320 11:02:07.172642 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-zjfp8" event={"ID":"09980524-2db0-4279-8e7c-09d82081be4b","Type":"ContainerStarted","Data":"7645810f63db6d98f89257c6ff5de9d6f512d0b45162b05f786543ac22180372"} Mar 20 11:02:07 crc kubenswrapper[4860]: I0320 11:02:07.172917 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-zjfp8" Mar 20 11:02:07 crc kubenswrapper[4860]: I0320 11:02:07.204210 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-zjfp8" podStartSLOduration=2.204172965 podStartE2EDuration="2.204172965s" podCreationTimestamp="2026-03-20 11:02:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 11:02:07.194909628 +0000 UTC m=+451.416270526" watchObservedRunningTime="2026-03-20 11:02:07.204172965 +0000 UTC m=+451.425533863" Mar 20 
11:02:07 crc kubenswrapper[4860]: I0320 11:02:07.292115 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zrj5v" Mar 20 11:02:07 crc kubenswrapper[4860]: I0320 11:02:07.427034 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="403ca5f6-bd52-40de-88d6-5151b3202c76" path="/var/lib/kubelet/pods/403ca5f6-bd52-40de-88d6-5151b3202c76/volumes" Mar 20 11:02:07 crc kubenswrapper[4860]: I0320 11:02:07.427679 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="514f05c3-1404-46c6-9f4d-68437ea8ee0b" path="/var/lib/kubelet/pods/514f05c3-1404-46c6-9f4d-68437ea8ee0b/volumes" Mar 20 11:02:07 crc kubenswrapper[4860]: I0320 11:02:07.428389 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f20cb95e-5480-4c9c-859f-0b03d679ab06" path="/var/lib/kubelet/pods/f20cb95e-5480-4c9c-859f-0b03d679ab06/volumes" Mar 20 11:02:07 crc kubenswrapper[4860]: I0320 11:02:07.517343 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zrj5v"] Mar 20 11:02:07 crc kubenswrapper[4860]: W0320 11:02:07.531735 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d34a762_55ad_41cb_994e_d4707bfebe22.slice/crio-9ed745147141abe4b27dbd58cfbd4de2fa83854d7c504a6c3c963c48853161c3 WatchSource:0}: Error finding container 9ed745147141abe4b27dbd58cfbd4de2fa83854d7c504a6c3c963c48853161c3: Status 404 returned error can't find the container with id 9ed745147141abe4b27dbd58cfbd4de2fa83854d7c504a6c3c963c48853161c3 Mar 20 11:02:07 crc kubenswrapper[4860]: I0320 11:02:07.984779 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-s2x6p"] Mar 20 11:02:07 crc kubenswrapper[4860]: I0320 11:02:07.986713 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-s2x6p" Mar 20 11:02:07 crc kubenswrapper[4860]: I0320 11:02:07.989203 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 20 11:02:08 crc kubenswrapper[4860]: I0320 11:02:08.044121 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da2b2cab-e4d8-48ed-b198-7aff45927348-catalog-content\") pod \"community-operators-s2x6p\" (UID: \"da2b2cab-e4d8-48ed-b198-7aff45927348\") " pod="openshift-marketplace/community-operators-s2x6p" Mar 20 11:02:08 crc kubenswrapper[4860]: I0320 11:02:08.044250 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da2b2cab-e4d8-48ed-b198-7aff45927348-utilities\") pod \"community-operators-s2x6p\" (UID: \"da2b2cab-e4d8-48ed-b198-7aff45927348\") " pod="openshift-marketplace/community-operators-s2x6p" Mar 20 11:02:08 crc kubenswrapper[4860]: I0320 11:02:08.044296 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-267pr\" (UniqueName: \"kubernetes.io/projected/da2b2cab-e4d8-48ed-b198-7aff45927348-kube-api-access-267pr\") pod \"community-operators-s2x6p\" (UID: \"da2b2cab-e4d8-48ed-b198-7aff45927348\") " pod="openshift-marketplace/community-operators-s2x6p" Mar 20 11:02:08 crc kubenswrapper[4860]: I0320 11:02:08.045946 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s2x6p"] Mar 20 11:02:08 crc kubenswrapper[4860]: I0320 11:02:08.145629 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da2b2cab-e4d8-48ed-b198-7aff45927348-utilities\") pod \"community-operators-s2x6p\" (UID: 
\"da2b2cab-e4d8-48ed-b198-7aff45927348\") " pod="openshift-marketplace/community-operators-s2x6p" Mar 20 11:02:08 crc kubenswrapper[4860]: I0320 11:02:08.145725 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-267pr\" (UniqueName: \"kubernetes.io/projected/da2b2cab-e4d8-48ed-b198-7aff45927348-kube-api-access-267pr\") pod \"community-operators-s2x6p\" (UID: \"da2b2cab-e4d8-48ed-b198-7aff45927348\") " pod="openshift-marketplace/community-operators-s2x6p" Mar 20 11:02:08 crc kubenswrapper[4860]: I0320 11:02:08.145804 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da2b2cab-e4d8-48ed-b198-7aff45927348-catalog-content\") pod \"community-operators-s2x6p\" (UID: \"da2b2cab-e4d8-48ed-b198-7aff45927348\") " pod="openshift-marketplace/community-operators-s2x6p" Mar 20 11:02:08 crc kubenswrapper[4860]: I0320 11:02:08.146526 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da2b2cab-e4d8-48ed-b198-7aff45927348-catalog-content\") pod \"community-operators-s2x6p\" (UID: \"da2b2cab-e4d8-48ed-b198-7aff45927348\") " pod="openshift-marketplace/community-operators-s2x6p" Mar 20 11:02:08 crc kubenswrapper[4860]: I0320 11:02:08.146655 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da2b2cab-e4d8-48ed-b198-7aff45927348-utilities\") pod \"community-operators-s2x6p\" (UID: \"da2b2cab-e4d8-48ed-b198-7aff45927348\") " pod="openshift-marketplace/community-operators-s2x6p" Mar 20 11:02:08 crc kubenswrapper[4860]: I0320 11:02:08.167620 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-267pr\" (UniqueName: \"kubernetes.io/projected/da2b2cab-e4d8-48ed-b198-7aff45927348-kube-api-access-267pr\") pod \"community-operators-s2x6p\" (UID: 
\"da2b2cab-e4d8-48ed-b198-7aff45927348\") " pod="openshift-marketplace/community-operators-s2x6p" Mar 20 11:02:08 crc kubenswrapper[4860]: I0320 11:02:08.182951 4860 generic.go:334] "Generic (PLEG): container finished" podID="8d34a762-55ad-41cb-994e-d4707bfebe22" containerID="bded5492155070d57a139dff89185221af2941a144e60a6b22a8e6eae05f55ca" exitCode=0 Mar 20 11:02:08 crc kubenswrapper[4860]: I0320 11:02:08.183036 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zrj5v" event={"ID":"8d34a762-55ad-41cb-994e-d4707bfebe22","Type":"ContainerDied","Data":"bded5492155070d57a139dff89185221af2941a144e60a6b22a8e6eae05f55ca"} Mar 20 11:02:08 crc kubenswrapper[4860]: I0320 11:02:08.183104 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zrj5v" event={"ID":"8d34a762-55ad-41cb-994e-d4707bfebe22","Type":"ContainerStarted","Data":"9ed745147141abe4b27dbd58cfbd4de2fa83854d7c504a6c3c963c48853161c3"} Mar 20 11:02:08 crc kubenswrapper[4860]: I0320 11:02:08.302599 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-s2x6p" Mar 20 11:02:08 crc kubenswrapper[4860]: I0320 11:02:08.536602 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s2x6p"] Mar 20 11:02:09 crc kubenswrapper[4860]: I0320 11:02:09.191309 4860 generic.go:334] "Generic (PLEG): container finished" podID="da2b2cab-e4d8-48ed-b198-7aff45927348" containerID="a4f90ca93d3e43497e705c4521beb4348408ab8d69ef5b2bcd7028aec3d686d5" exitCode=0 Mar 20 11:02:09 crc kubenswrapper[4860]: I0320 11:02:09.191455 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s2x6p" event={"ID":"da2b2cab-e4d8-48ed-b198-7aff45927348","Type":"ContainerDied","Data":"a4f90ca93d3e43497e705c4521beb4348408ab8d69ef5b2bcd7028aec3d686d5"} Mar 20 11:02:09 crc kubenswrapper[4860]: I0320 11:02:09.191974 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s2x6p" event={"ID":"da2b2cab-e4d8-48ed-b198-7aff45927348","Type":"ContainerStarted","Data":"0985eda395c30cb4fc11c5a030b8aabf733cd8d60366ed8ecb07d45313940c24"} Mar 20 11:02:09 crc kubenswrapper[4860]: I0320 11:02:09.382146 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-47qz8"] Mar 20 11:02:09 crc kubenswrapper[4860]: I0320 11:02:09.387307 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-47qz8" Mar 20 11:02:09 crc kubenswrapper[4860]: I0320 11:02:09.398389 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 20 11:02:09 crc kubenswrapper[4860]: I0320 11:02:09.408356 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-47qz8"] Mar 20 11:02:09 crc kubenswrapper[4860]: I0320 11:02:09.466745 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bt5v\" (UniqueName: \"kubernetes.io/projected/a3a77828-39d7-4547-ba09-26a9a0fb8e7b-kube-api-access-2bt5v\") pod \"certified-operators-47qz8\" (UID: \"a3a77828-39d7-4547-ba09-26a9a0fb8e7b\") " pod="openshift-marketplace/certified-operators-47qz8" Mar 20 11:02:09 crc kubenswrapper[4860]: I0320 11:02:09.466795 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3a77828-39d7-4547-ba09-26a9a0fb8e7b-utilities\") pod \"certified-operators-47qz8\" (UID: \"a3a77828-39d7-4547-ba09-26a9a0fb8e7b\") " pod="openshift-marketplace/certified-operators-47qz8" Mar 20 11:02:09 crc kubenswrapper[4860]: I0320 11:02:09.466852 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3a77828-39d7-4547-ba09-26a9a0fb8e7b-catalog-content\") pod \"certified-operators-47qz8\" (UID: \"a3a77828-39d7-4547-ba09-26a9a0fb8e7b\") " pod="openshift-marketplace/certified-operators-47qz8" Mar 20 11:02:09 crc kubenswrapper[4860]: I0320 11:02:09.568172 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bt5v\" (UniqueName: \"kubernetes.io/projected/a3a77828-39d7-4547-ba09-26a9a0fb8e7b-kube-api-access-2bt5v\") pod \"certified-operators-47qz8\" 
(UID: \"a3a77828-39d7-4547-ba09-26a9a0fb8e7b\") " pod="openshift-marketplace/certified-operators-47qz8" Mar 20 11:02:09 crc kubenswrapper[4860]: I0320 11:02:09.568267 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3a77828-39d7-4547-ba09-26a9a0fb8e7b-utilities\") pod \"certified-operators-47qz8\" (UID: \"a3a77828-39d7-4547-ba09-26a9a0fb8e7b\") " pod="openshift-marketplace/certified-operators-47qz8" Mar 20 11:02:09 crc kubenswrapper[4860]: I0320 11:02:09.568529 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3a77828-39d7-4547-ba09-26a9a0fb8e7b-catalog-content\") pod \"certified-operators-47qz8\" (UID: \"a3a77828-39d7-4547-ba09-26a9a0fb8e7b\") " pod="openshift-marketplace/certified-operators-47qz8" Mar 20 11:02:09 crc kubenswrapper[4860]: I0320 11:02:09.568945 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3a77828-39d7-4547-ba09-26a9a0fb8e7b-utilities\") pod \"certified-operators-47qz8\" (UID: \"a3a77828-39d7-4547-ba09-26a9a0fb8e7b\") " pod="openshift-marketplace/certified-operators-47qz8" Mar 20 11:02:09 crc kubenswrapper[4860]: I0320 11:02:09.569183 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3a77828-39d7-4547-ba09-26a9a0fb8e7b-catalog-content\") pod \"certified-operators-47qz8\" (UID: \"a3a77828-39d7-4547-ba09-26a9a0fb8e7b\") " pod="openshift-marketplace/certified-operators-47qz8" Mar 20 11:02:09 crc kubenswrapper[4860]: I0320 11:02:09.588068 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bt5v\" (UniqueName: \"kubernetes.io/projected/a3a77828-39d7-4547-ba09-26a9a0fb8e7b-kube-api-access-2bt5v\") pod \"certified-operators-47qz8\" (UID: \"a3a77828-39d7-4547-ba09-26a9a0fb8e7b\") " 
pod="openshift-marketplace/certified-operators-47qz8" Mar 20 11:02:09 crc kubenswrapper[4860]: I0320 11:02:09.723924 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-47qz8" Mar 20 11:02:09 crc kubenswrapper[4860]: I0320 11:02:09.980675 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-47qz8"] Mar 20 11:02:10 crc kubenswrapper[4860]: I0320 11:02:10.200843 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s2x6p" event={"ID":"da2b2cab-e4d8-48ed-b198-7aff45927348","Type":"ContainerStarted","Data":"3692f916ebd9e82f76728a61b0840c3354adb6672a36ba82bf89b317d7536cc6"} Mar 20 11:02:10 crc kubenswrapper[4860]: I0320 11:02:10.203465 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zrj5v" event={"ID":"8d34a762-55ad-41cb-994e-d4707bfebe22","Type":"ContainerStarted","Data":"baadb0aebaf718ba429229f6967e10ea610d78bad4be71114d5b27b336f733eb"} Mar 20 11:02:10 crc kubenswrapper[4860]: I0320 11:02:10.205185 4860 generic.go:334] "Generic (PLEG): container finished" podID="a3a77828-39d7-4547-ba09-26a9a0fb8e7b" containerID="acb2012425b8eb6b69ac49a4cfcbb9f33c79aef77a6ddd96d2110907e1765fb8" exitCode=0 Mar 20 11:02:10 crc kubenswrapper[4860]: I0320 11:02:10.205254 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-47qz8" event={"ID":"a3a77828-39d7-4547-ba09-26a9a0fb8e7b","Type":"ContainerDied","Data":"acb2012425b8eb6b69ac49a4cfcbb9f33c79aef77a6ddd96d2110907e1765fb8"} Mar 20 11:02:10 crc kubenswrapper[4860]: I0320 11:02:10.205276 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-47qz8" event={"ID":"a3a77828-39d7-4547-ba09-26a9a0fb8e7b","Type":"ContainerStarted","Data":"c72cf96052172bb404f7f16a65c7ab617ea74b3d559d23fc67c862ea51f205b2"} Mar 20 11:02:10 crc 
kubenswrapper[4860]: I0320 11:02:10.377317 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mhds4"] Mar 20 11:02:10 crc kubenswrapper[4860]: I0320 11:02:10.378524 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mhds4" Mar 20 11:02:10 crc kubenswrapper[4860]: I0320 11:02:10.382062 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 20 11:02:10 crc kubenswrapper[4860]: I0320 11:02:10.395351 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mhds4"] Mar 20 11:02:10 crc kubenswrapper[4860]: I0320 11:02:10.484362 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f820689f-28ee-4cbe-bf7b-049d9ec6ef64-utilities\") pod \"redhat-marketplace-mhds4\" (UID: \"f820689f-28ee-4cbe-bf7b-049d9ec6ef64\") " pod="openshift-marketplace/redhat-marketplace-mhds4" Mar 20 11:02:10 crc kubenswrapper[4860]: I0320 11:02:10.484457 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f820689f-28ee-4cbe-bf7b-049d9ec6ef64-catalog-content\") pod \"redhat-marketplace-mhds4\" (UID: \"f820689f-28ee-4cbe-bf7b-049d9ec6ef64\") " pod="openshift-marketplace/redhat-marketplace-mhds4" Mar 20 11:02:10 crc kubenswrapper[4860]: I0320 11:02:10.484537 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtxjr\" (UniqueName: \"kubernetes.io/projected/f820689f-28ee-4cbe-bf7b-049d9ec6ef64-kube-api-access-dtxjr\") pod \"redhat-marketplace-mhds4\" (UID: \"f820689f-28ee-4cbe-bf7b-049d9ec6ef64\") " pod="openshift-marketplace/redhat-marketplace-mhds4" Mar 20 11:02:10 crc kubenswrapper[4860]: I0320 
11:02:10.585724 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f820689f-28ee-4cbe-bf7b-049d9ec6ef64-utilities\") pod \"redhat-marketplace-mhds4\" (UID: \"f820689f-28ee-4cbe-bf7b-049d9ec6ef64\") " pod="openshift-marketplace/redhat-marketplace-mhds4" Mar 20 11:02:10 crc kubenswrapper[4860]: I0320 11:02:10.585791 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f820689f-28ee-4cbe-bf7b-049d9ec6ef64-catalog-content\") pod \"redhat-marketplace-mhds4\" (UID: \"f820689f-28ee-4cbe-bf7b-049d9ec6ef64\") " pod="openshift-marketplace/redhat-marketplace-mhds4" Mar 20 11:02:10 crc kubenswrapper[4860]: I0320 11:02:10.585817 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtxjr\" (UniqueName: \"kubernetes.io/projected/f820689f-28ee-4cbe-bf7b-049d9ec6ef64-kube-api-access-dtxjr\") pod \"redhat-marketplace-mhds4\" (UID: \"f820689f-28ee-4cbe-bf7b-049d9ec6ef64\") " pod="openshift-marketplace/redhat-marketplace-mhds4" Mar 20 11:02:10 crc kubenswrapper[4860]: I0320 11:02:10.586551 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f820689f-28ee-4cbe-bf7b-049d9ec6ef64-catalog-content\") pod \"redhat-marketplace-mhds4\" (UID: \"f820689f-28ee-4cbe-bf7b-049d9ec6ef64\") " pod="openshift-marketplace/redhat-marketplace-mhds4" Mar 20 11:02:10 crc kubenswrapper[4860]: I0320 11:02:10.586865 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f820689f-28ee-4cbe-bf7b-049d9ec6ef64-utilities\") pod \"redhat-marketplace-mhds4\" (UID: \"f820689f-28ee-4cbe-bf7b-049d9ec6ef64\") " pod="openshift-marketplace/redhat-marketplace-mhds4" Mar 20 11:02:10 crc kubenswrapper[4860]: I0320 11:02:10.606537 4860 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-dtxjr\" (UniqueName: \"kubernetes.io/projected/f820689f-28ee-4cbe-bf7b-049d9ec6ef64-kube-api-access-dtxjr\") pod \"redhat-marketplace-mhds4\" (UID: \"f820689f-28ee-4cbe-bf7b-049d9ec6ef64\") " pod="openshift-marketplace/redhat-marketplace-mhds4" Mar 20 11:02:10 crc kubenswrapper[4860]: I0320 11:02:10.693699 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mhds4" Mar 20 11:02:10 crc kubenswrapper[4860]: I0320 11:02:10.902895 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mhds4"] Mar 20 11:02:11 crc kubenswrapper[4860]: I0320 11:02:11.216436 4860 generic.go:334] "Generic (PLEG): container finished" podID="8d34a762-55ad-41cb-994e-d4707bfebe22" containerID="baadb0aebaf718ba429229f6967e10ea610d78bad4be71114d5b27b336f733eb" exitCode=0 Mar 20 11:02:11 crc kubenswrapper[4860]: I0320 11:02:11.216563 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zrj5v" event={"ID":"8d34a762-55ad-41cb-994e-d4707bfebe22","Type":"ContainerDied","Data":"baadb0aebaf718ba429229f6967e10ea610d78bad4be71114d5b27b336f733eb"} Mar 20 11:02:11 crc kubenswrapper[4860]: I0320 11:02:11.220690 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-47qz8" event={"ID":"a3a77828-39d7-4547-ba09-26a9a0fb8e7b","Type":"ContainerStarted","Data":"cbb4bb657f80c7b555a0587373c2ce4e2b3f566ea4f63e69e63e02e2a6b2ec7a"} Mar 20 11:02:11 crc kubenswrapper[4860]: I0320 11:02:11.233208 4860 generic.go:334] "Generic (PLEG): container finished" podID="da2b2cab-e4d8-48ed-b198-7aff45927348" containerID="3692f916ebd9e82f76728a61b0840c3354adb6672a36ba82bf89b317d7536cc6" exitCode=0 Mar 20 11:02:11 crc kubenswrapper[4860]: I0320 11:02:11.233329 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s2x6p" 
event={"ID":"da2b2cab-e4d8-48ed-b198-7aff45927348","Type":"ContainerDied","Data":"3692f916ebd9e82f76728a61b0840c3354adb6672a36ba82bf89b317d7536cc6"} Mar 20 11:02:11 crc kubenswrapper[4860]: I0320 11:02:11.247359 4860 generic.go:334] "Generic (PLEG): container finished" podID="f820689f-28ee-4cbe-bf7b-049d9ec6ef64" containerID="825da8bd923b4ec65455cb3fc9a9543a1bb3dde599c10fcad2d8d0915679c73b" exitCode=0 Mar 20 11:02:11 crc kubenswrapper[4860]: I0320 11:02:11.247414 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mhds4" event={"ID":"f820689f-28ee-4cbe-bf7b-049d9ec6ef64","Type":"ContainerDied","Data":"825da8bd923b4ec65455cb3fc9a9543a1bb3dde599c10fcad2d8d0915679c73b"} Mar 20 11:02:11 crc kubenswrapper[4860]: I0320 11:02:11.247454 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mhds4" event={"ID":"f820689f-28ee-4cbe-bf7b-049d9ec6ef64","Type":"ContainerStarted","Data":"3e386c36a91903733f639f25b71736ac81412f01a413b43f4de17bbfd5d41ebb"} Mar 20 11:02:12 crc kubenswrapper[4860]: I0320 11:02:12.257769 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zrj5v" event={"ID":"8d34a762-55ad-41cb-994e-d4707bfebe22","Type":"ContainerStarted","Data":"1441bba4ff54877f3670f5cf889c2f3f29ba6e9d0688c3c6feb2a2b3d60283bf"} Mar 20 11:02:12 crc kubenswrapper[4860]: I0320 11:02:12.259599 4860 generic.go:334] "Generic (PLEG): container finished" podID="a3a77828-39d7-4547-ba09-26a9a0fb8e7b" containerID="cbb4bb657f80c7b555a0587373c2ce4e2b3f566ea4f63e69e63e02e2a6b2ec7a" exitCode=0 Mar 20 11:02:12 crc kubenswrapper[4860]: I0320 11:02:12.259647 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-47qz8" event={"ID":"a3a77828-39d7-4547-ba09-26a9a0fb8e7b","Type":"ContainerDied","Data":"cbb4bb657f80c7b555a0587373c2ce4e2b3f566ea4f63e69e63e02e2a6b2ec7a"} Mar 20 11:02:12 crc kubenswrapper[4860]: I0320 
11:02:12.270805 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s2x6p" event={"ID":"da2b2cab-e4d8-48ed-b198-7aff45927348","Type":"ContainerStarted","Data":"c529e06c44dcbe461813fdd459369a14b6b219590561c7a4393e8a5bf49d0970"} Mar 20 11:02:12 crc kubenswrapper[4860]: I0320 11:02:12.274240 4860 generic.go:334] "Generic (PLEG): container finished" podID="f820689f-28ee-4cbe-bf7b-049d9ec6ef64" containerID="d9b383675be39b9fd832a1c5911831cc8302efe7e48e72c595029c9c5dc8b325" exitCode=0 Mar 20 11:02:12 crc kubenswrapper[4860]: I0320 11:02:12.274315 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mhds4" event={"ID":"f820689f-28ee-4cbe-bf7b-049d9ec6ef64","Type":"ContainerDied","Data":"d9b383675be39b9fd832a1c5911831cc8302efe7e48e72c595029c9c5dc8b325"} Mar 20 11:02:12 crc kubenswrapper[4860]: I0320 11:02:12.280523 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zrj5v" podStartSLOduration=2.727340365 podStartE2EDuration="6.280499513s" podCreationTimestamp="2026-03-20 11:02:06 +0000 UTC" firstStartedPulling="2026-03-20 11:02:08.18630992 +0000 UTC m=+452.407670838" lastFinishedPulling="2026-03-20 11:02:11.739469088 +0000 UTC m=+455.960829986" observedRunningTime="2026-03-20 11:02:12.276565995 +0000 UTC m=+456.497926893" watchObservedRunningTime="2026-03-20 11:02:12.280499513 +0000 UTC m=+456.501860411" Mar 20 11:02:12 crc kubenswrapper[4860]: I0320 11:02:12.327757 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-s2x6p" podStartSLOduration=2.586557625 podStartE2EDuration="5.327735343s" podCreationTimestamp="2026-03-20 11:02:07 +0000 UTC" firstStartedPulling="2026-03-20 11:02:09.264286844 +0000 UTC m=+453.485647742" lastFinishedPulling="2026-03-20 11:02:12.005464562 +0000 UTC m=+456.226825460" observedRunningTime="2026-03-20 11:02:12.326652973 
+0000 UTC m=+456.548013881" watchObservedRunningTime="2026-03-20 11:02:12.327735343 +0000 UTC m=+456.549096241" Mar 20 11:02:13 crc kubenswrapper[4860]: I0320 11:02:13.302671 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-47qz8" event={"ID":"a3a77828-39d7-4547-ba09-26a9a0fb8e7b","Type":"ContainerStarted","Data":"1b3ec51cf2c526996dc1328276334f591b28f8d8fe3810e296d7cb4924676dee"} Mar 20 11:02:14 crc kubenswrapper[4860]: I0320 11:02:14.311997 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mhds4" event={"ID":"f820689f-28ee-4cbe-bf7b-049d9ec6ef64","Type":"ContainerStarted","Data":"4fa395af66801548de446165dab28cdcfc56e658660bf981d9dc1587a6995b23"} Mar 20 11:02:14 crc kubenswrapper[4860]: I0320 11:02:14.333831 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mhds4" podStartSLOduration=2.174703475 podStartE2EDuration="4.333803016s" podCreationTimestamp="2026-03-20 11:02:10 +0000 UTC" firstStartedPulling="2026-03-20 11:02:11.249151709 +0000 UTC m=+455.470512607" lastFinishedPulling="2026-03-20 11:02:13.40825125 +0000 UTC m=+457.629612148" observedRunningTime="2026-03-20 11:02:14.333323643 +0000 UTC m=+458.554684571" watchObservedRunningTime="2026-03-20 11:02:14.333803016 +0000 UTC m=+458.555163914" Mar 20 11:02:14 crc kubenswrapper[4860]: I0320 11:02:14.337241 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-47qz8" podStartSLOduration=2.686000779 podStartE2EDuration="5.33721725s" podCreationTimestamp="2026-03-20 11:02:09 +0000 UTC" firstStartedPulling="2026-03-20 11:02:10.206850965 +0000 UTC m=+454.428211863" lastFinishedPulling="2026-03-20 11:02:12.858067436 +0000 UTC m=+457.079428334" observedRunningTime="2026-03-20 11:02:13.332675087 +0000 UTC m=+457.554035985" watchObservedRunningTime="2026-03-20 11:02:14.33721725 +0000 UTC 
m=+458.558578148" Mar 20 11:02:17 crc kubenswrapper[4860]: I0320 11:02:17.292428 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zrj5v" Mar 20 11:02:17 crc kubenswrapper[4860]: I0320 11:02:17.292514 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zrj5v" Mar 20 11:02:17 crc kubenswrapper[4860]: I0320 11:02:17.505154 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 11:02:17 crc kubenswrapper[4860]: I0320 11:02:17.505746 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 11:02:17 crc kubenswrapper[4860]: I0320 11:02:17.505782 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 11:02:17 crc kubenswrapper[4860]: I0320 11:02:17.505851 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod 
\"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 11:02:17 crc kubenswrapper[4860]: I0320 11:02:17.514333 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 11:02:17 crc kubenswrapper[4860]: I0320 11:02:17.514682 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 11:02:17 crc kubenswrapper[4860]: I0320 11:02:17.517239 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 11:02:17 crc kubenswrapper[4860]: I0320 11:02:17.536075 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 11:02:17 crc kubenswrapper[4860]: I0320 11:02:17.614110 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 11:02:17 crc kubenswrapper[4860]: I0320 11:02:17.619748 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 11:02:17 crc kubenswrapper[4860]: I0320 11:02:17.715165 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 11:02:18 crc kubenswrapper[4860]: I0320 11:02:18.303391 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-s2x6p" Mar 20 11:02:18 crc kubenswrapper[4860]: I0320 11:02:18.303989 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-s2x6p" Mar 20 11:02:18 crc kubenswrapper[4860]: I0320 11:02:18.333867 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zrj5v" podUID="8d34a762-55ad-41cb-994e-d4707bfebe22" containerName="registry-server" probeResult="failure" output=< Mar 20 11:02:18 crc kubenswrapper[4860]: timeout: failed to connect service ":50051" within 1s Mar 20 11:02:18 crc kubenswrapper[4860]: > Mar 20 11:02:18 crc kubenswrapper[4860]: I0320 11:02:18.339515 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"f298e33c11e6b58e469b348abad4c52ecd23dda75a18e6fcff7999ea1729b44e"} Mar 20 11:02:18 crc kubenswrapper[4860]: I0320 11:02:18.339565 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"262672598caa4e4a0085905ae9658928784b225b2f71ee03f7b7fe31491f435e"} 
Mar 20 11:02:18 crc kubenswrapper[4860]: I0320 11:02:18.341723 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"002c0b7c930a53311b4db7b99e5c4e0aa5ce0bfcae9c9717d370ec94841d28c5"} Mar 20 11:02:18 crc kubenswrapper[4860]: I0320 11:02:18.341759 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"7ec992ed2adce225654dbd6f832cdfc256c8d935244ceb0addad630e60b9ba73"} Mar 20 11:02:18 crc kubenswrapper[4860]: I0320 11:02:18.341916 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 11:02:18 crc kubenswrapper[4860]: I0320 11:02:18.343702 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"b906bee3273b9e266d2ede1cbd8a9d8d030f9120dad40f07f2edf1d66eb0fd8e"} Mar 20 11:02:18 crc kubenswrapper[4860]: I0320 11:02:18.343753 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"3decbd652db47b6a9f28fcac12f076a7a7f283d40c4777cd025604439b3c1fbd"} Mar 20 11:02:18 crc kubenswrapper[4860]: I0320 11:02:18.349606 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-s2x6p" Mar 20 11:02:18 crc kubenswrapper[4860]: I0320 11:02:18.405909 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-s2x6p" Mar 20 11:02:19 crc kubenswrapper[4860]: I0320 11:02:19.724799 4860 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-47qz8" Mar 20 11:02:19 crc kubenswrapper[4860]: I0320 11:02:19.725278 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-47qz8" Mar 20 11:02:19 crc kubenswrapper[4860]: I0320 11:02:19.773682 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-47qz8" Mar 20 11:02:20 crc kubenswrapper[4860]: I0320 11:02:20.402067 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-47qz8" Mar 20 11:02:20 crc kubenswrapper[4860]: I0320 11:02:20.693884 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mhds4" Mar 20 11:02:20 crc kubenswrapper[4860]: I0320 11:02:20.693971 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mhds4" Mar 20 11:02:20 crc kubenswrapper[4860]: I0320 11:02:20.747931 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mhds4" Mar 20 11:02:21 crc kubenswrapper[4860]: I0320 11:02:21.434884 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mhds4" Mar 20 11:02:22 crc kubenswrapper[4860]: I0320 11:02:22.344858 4860 patch_prober.go:28] interesting pod/machine-config-daemon-kvdqp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:02:22 crc kubenswrapper[4860]: I0320 11:02:22.344934 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" 
podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:02:25 crc kubenswrapper[4860]: I0320 11:02:25.575968 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-zjfp8" Mar 20 11:02:25 crc kubenswrapper[4860]: I0320 11:02:25.668712 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-8dbgm"] Mar 20 11:02:27 crc kubenswrapper[4860]: I0320 11:02:27.350965 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zrj5v" Mar 20 11:02:27 crc kubenswrapper[4860]: I0320 11:02:27.439629 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zrj5v" Mar 20 11:02:50 crc kubenswrapper[4860]: I0320 11:02:50.732130 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" podUID="39b41087-226b-4f73-9fc4-64616b430f2d" containerName="registry" containerID="cri-o://f3f20d10721886ec4a9101220e9e657b2d89cba3d5a38f7e81c1ff64702925d8" gracePeriod=30 Mar 20 11:02:51 crc kubenswrapper[4860]: I0320 11:02:51.091913 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 11:02:51 crc kubenswrapper[4860]: I0320 11:02:51.216601 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/39b41087-226b-4f73-9fc4-64616b430f2d-registry-tls\") pod \"39b41087-226b-4f73-9fc4-64616b430f2d\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " Mar 20 11:02:51 crc kubenswrapper[4860]: I0320 11:02:51.216671 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n92tj\" (UniqueName: \"kubernetes.io/projected/39b41087-226b-4f73-9fc4-64616b430f2d-kube-api-access-n92tj\") pod \"39b41087-226b-4f73-9fc4-64616b430f2d\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " Mar 20 11:02:51 crc kubenswrapper[4860]: I0320 11:02:51.218093 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/39b41087-226b-4f73-9fc4-64616b430f2d-bound-sa-token\") pod \"39b41087-226b-4f73-9fc4-64616b430f2d\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " Mar 20 11:02:51 crc kubenswrapper[4860]: I0320 11:02:51.218155 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/39b41087-226b-4f73-9fc4-64616b430f2d-trusted-ca\") pod \"39b41087-226b-4f73-9fc4-64616b430f2d\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " Mar 20 11:02:51 crc kubenswrapper[4860]: I0320 11:02:51.218331 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/39b41087-226b-4f73-9fc4-64616b430f2d-ca-trust-extracted\") pod \"39b41087-226b-4f73-9fc4-64616b430f2d\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " Mar 20 11:02:51 crc kubenswrapper[4860]: I0320 11:02:51.218375 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/39b41087-226b-4f73-9fc4-64616b430f2d-installation-pull-secrets\") pod \"39b41087-226b-4f73-9fc4-64616b430f2d\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " Mar 20 11:02:51 crc kubenswrapper[4860]: I0320 11:02:51.218514 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"39b41087-226b-4f73-9fc4-64616b430f2d\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " Mar 20 11:02:51 crc kubenswrapper[4860]: I0320 11:02:51.218641 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/39b41087-226b-4f73-9fc4-64616b430f2d-registry-certificates\") pod \"39b41087-226b-4f73-9fc4-64616b430f2d\" (UID: \"39b41087-226b-4f73-9fc4-64616b430f2d\") " Mar 20 11:02:51 crc kubenswrapper[4860]: I0320 11:02:51.220192 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39b41087-226b-4f73-9fc4-64616b430f2d-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "39b41087-226b-4f73-9fc4-64616b430f2d" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:02:51 crc kubenswrapper[4860]: I0320 11:02:51.220205 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39b41087-226b-4f73-9fc4-64616b430f2d-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "39b41087-226b-4f73-9fc4-64616b430f2d" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:02:51 crc kubenswrapper[4860]: I0320 11:02:51.225383 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39b41087-226b-4f73-9fc4-64616b430f2d-kube-api-access-n92tj" (OuterVolumeSpecName: "kube-api-access-n92tj") pod "39b41087-226b-4f73-9fc4-64616b430f2d" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d"). InnerVolumeSpecName "kube-api-access-n92tj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:02:51 crc kubenswrapper[4860]: I0320 11:02:51.226332 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39b41087-226b-4f73-9fc4-64616b430f2d-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "39b41087-226b-4f73-9fc4-64616b430f2d" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:02:51 crc kubenswrapper[4860]: I0320 11:02:51.226706 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39b41087-226b-4f73-9fc4-64616b430f2d-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "39b41087-226b-4f73-9fc4-64616b430f2d" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:02:51 crc kubenswrapper[4860]: I0320 11:02:51.226813 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39b41087-226b-4f73-9fc4-64616b430f2d-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "39b41087-226b-4f73-9fc4-64616b430f2d" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:02:51 crc kubenswrapper[4860]: I0320 11:02:51.232856 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "39b41087-226b-4f73-9fc4-64616b430f2d" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 20 11:02:51 crc kubenswrapper[4860]: I0320 11:02:51.241627 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39b41087-226b-4f73-9fc4-64616b430f2d-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "39b41087-226b-4f73-9fc4-64616b430f2d" (UID: "39b41087-226b-4f73-9fc4-64616b430f2d"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:02:51 crc kubenswrapper[4860]: I0320 11:02:51.320612 4860 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/39b41087-226b-4f73-9fc4-64616b430f2d-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 11:02:51 crc kubenswrapper[4860]: I0320 11:02:51.320682 4860 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/39b41087-226b-4f73-9fc4-64616b430f2d-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 11:02:51 crc kubenswrapper[4860]: I0320 11:02:51.320700 4860 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/39b41087-226b-4f73-9fc4-64616b430f2d-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 20 11:02:51 crc kubenswrapper[4860]: I0320 11:02:51.320719 4860 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/39b41087-226b-4f73-9fc4-64616b430f2d-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 20 11:02:51 crc kubenswrapper[4860]: I0320 11:02:51.320741 4860 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/39b41087-226b-4f73-9fc4-64616b430f2d-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 20 11:02:51 crc kubenswrapper[4860]: I0320 11:02:51.320757 4860 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/39b41087-226b-4f73-9fc4-64616b430f2d-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 20 11:02:51 crc kubenswrapper[4860]: I0320 11:02:51.320773 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n92tj\" (UniqueName: \"kubernetes.io/projected/39b41087-226b-4f73-9fc4-64616b430f2d-kube-api-access-n92tj\") on node \"crc\" DevicePath \"\"" Mar 20 11:02:51 crc kubenswrapper[4860]: I0320 11:02:51.552981 4860 generic.go:334] "Generic (PLEG): container finished" podID="39b41087-226b-4f73-9fc4-64616b430f2d" containerID="f3f20d10721886ec4a9101220e9e657b2d89cba3d5a38f7e81c1ff64702925d8" exitCode=0 Mar 20 11:02:51 crc kubenswrapper[4860]: I0320 11:02:51.553049 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" event={"ID":"39b41087-226b-4f73-9fc4-64616b430f2d","Type":"ContainerDied","Data":"f3f20d10721886ec4a9101220e9e657b2d89cba3d5a38f7e81c1ff64702925d8"} Mar 20 11:02:51 crc kubenswrapper[4860]: I0320 11:02:51.553087 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" event={"ID":"39b41087-226b-4f73-9fc4-64616b430f2d","Type":"ContainerDied","Data":"e1dc651025a62d4a8c6173da89e1be13654b3ccc8586f46471cbc5846256700b"} Mar 20 11:02:51 crc kubenswrapper[4860]: I0320 11:02:51.553111 4860 scope.go:117] "RemoveContainer" 
containerID="f3f20d10721886ec4a9101220e9e657b2d89cba3d5a38f7e81c1ff64702925d8" Mar 20 11:02:51 crc kubenswrapper[4860]: I0320 11:02:51.553126 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-8dbgm" Mar 20 11:02:51 crc kubenswrapper[4860]: I0320 11:02:51.580064 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-8dbgm"] Mar 20 11:02:51 crc kubenswrapper[4860]: I0320 11:02:51.585863 4860 scope.go:117] "RemoveContainer" containerID="f3f20d10721886ec4a9101220e9e657b2d89cba3d5a38f7e81c1ff64702925d8" Mar 20 11:02:51 crc kubenswrapper[4860]: E0320 11:02:51.586726 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3f20d10721886ec4a9101220e9e657b2d89cba3d5a38f7e81c1ff64702925d8\": container with ID starting with f3f20d10721886ec4a9101220e9e657b2d89cba3d5a38f7e81c1ff64702925d8 not found: ID does not exist" containerID="f3f20d10721886ec4a9101220e9e657b2d89cba3d5a38f7e81c1ff64702925d8" Mar 20 11:02:51 crc kubenswrapper[4860]: I0320 11:02:51.586766 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3f20d10721886ec4a9101220e9e657b2d89cba3d5a38f7e81c1ff64702925d8"} err="failed to get container status \"f3f20d10721886ec4a9101220e9e657b2d89cba3d5a38f7e81c1ff64702925d8\": rpc error: code = NotFound desc = could not find container \"f3f20d10721886ec4a9101220e9e657b2d89cba3d5a38f7e81c1ff64702925d8\": container with ID starting with f3f20d10721886ec4a9101220e9e657b2d89cba3d5a38f7e81c1ff64702925d8 not found: ID does not exist" Mar 20 11:02:51 crc kubenswrapper[4860]: I0320 11:02:51.587885 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-8dbgm"] Mar 20 11:02:52 crc kubenswrapper[4860]: I0320 11:02:52.344892 4860 patch_prober.go:28] interesting 
pod/machine-config-daemon-kvdqp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:02:52 crc kubenswrapper[4860]: I0320 11:02:52.345359 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:02:52 crc kubenswrapper[4860]: I0320 11:02:52.345435 4860 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" Mar 20 11:02:52 crc kubenswrapper[4860]: I0320 11:02:52.346196 4860 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"13fba41711384af20ab8200d2257307c76fbc844be8572bbe7995f53f6fb9ca6"} pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 11:02:52 crc kubenswrapper[4860]: I0320 11:02:52.346269 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" containerID="cri-o://13fba41711384af20ab8200d2257307c76fbc844be8572bbe7995f53f6fb9ca6" gracePeriod=600 Mar 20 11:02:52 crc kubenswrapper[4860]: I0320 11:02:52.563249 4860 generic.go:334] "Generic (PLEG): container finished" podID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerID="13fba41711384af20ab8200d2257307c76fbc844be8572bbe7995f53f6fb9ca6" exitCode=0 Mar 20 11:02:52 crc kubenswrapper[4860]: I0320 11:02:52.563275 
4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" event={"ID":"6a9df230-75a1-4b64-8d00-c179e9c19080","Type":"ContainerDied","Data":"13fba41711384af20ab8200d2257307c76fbc844be8572bbe7995f53f6fb9ca6"} Mar 20 11:02:52 crc kubenswrapper[4860]: I0320 11:02:52.563418 4860 scope.go:117] "RemoveContainer" containerID="2176d4712fa4ee31d4322bb4cf9433058b2da0d6000e5bfb9e7d230fdffb3eda" Mar 20 11:02:53 crc kubenswrapper[4860]: I0320 11:02:53.429267 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39b41087-226b-4f73-9fc4-64616b430f2d" path="/var/lib/kubelet/pods/39b41087-226b-4f73-9fc4-64616b430f2d/volumes" Mar 20 11:02:53 crc kubenswrapper[4860]: I0320 11:02:53.575293 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" event={"ID":"6a9df230-75a1-4b64-8d00-c179e9c19080","Type":"ContainerStarted","Data":"bf060ef062b20409d88a33daa69a9b6db0709a5d9a572c42e386c266f6d32bae"} Mar 20 11:02:57 crc kubenswrapper[4860]: I0320 11:02:57.625782 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 11:04:00 crc kubenswrapper[4860]: I0320 11:04:00.143753 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566744-9jlnw"] Mar 20 11:04:00 crc kubenswrapper[4860]: E0320 11:04:00.144957 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39b41087-226b-4f73-9fc4-64616b430f2d" containerName="registry" Mar 20 11:04:00 crc kubenswrapper[4860]: I0320 11:04:00.144973 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="39b41087-226b-4f73-9fc4-64616b430f2d" containerName="registry" Mar 20 11:04:00 crc kubenswrapper[4860]: I0320 11:04:00.146947 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="39b41087-226b-4f73-9fc4-64616b430f2d" containerName="registry" Mar 20 
11:04:00 crc kubenswrapper[4860]: I0320 11:04:00.147525 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566744-9jlnw" Mar 20 11:04:00 crc kubenswrapper[4860]: I0320 11:04:00.150741 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:04:00 crc kubenswrapper[4860]: I0320 11:04:00.150827 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-82p2f" Mar 20 11:04:00 crc kubenswrapper[4860]: I0320 11:04:00.154712 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:04:00 crc kubenswrapper[4860]: I0320 11:04:00.155503 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566744-9jlnw"] Mar 20 11:04:00 crc kubenswrapper[4860]: I0320 11:04:00.267650 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2hc4\" (UniqueName: \"kubernetes.io/projected/d72f65a1-efc6-45f5-a056-d64ee5bce755-kube-api-access-j2hc4\") pod \"auto-csr-approver-29566744-9jlnw\" (UID: \"d72f65a1-efc6-45f5-a056-d64ee5bce755\") " pod="openshift-infra/auto-csr-approver-29566744-9jlnw" Mar 20 11:04:00 crc kubenswrapper[4860]: I0320 11:04:00.368941 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2hc4\" (UniqueName: \"kubernetes.io/projected/d72f65a1-efc6-45f5-a056-d64ee5bce755-kube-api-access-j2hc4\") pod \"auto-csr-approver-29566744-9jlnw\" (UID: \"d72f65a1-efc6-45f5-a056-d64ee5bce755\") " pod="openshift-infra/auto-csr-approver-29566744-9jlnw" Mar 20 11:04:00 crc kubenswrapper[4860]: I0320 11:04:00.389539 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2hc4\" (UniqueName: 
\"kubernetes.io/projected/d72f65a1-efc6-45f5-a056-d64ee5bce755-kube-api-access-j2hc4\") pod \"auto-csr-approver-29566744-9jlnw\" (UID: \"d72f65a1-efc6-45f5-a056-d64ee5bce755\") " pod="openshift-infra/auto-csr-approver-29566744-9jlnw" Mar 20 11:04:00 crc kubenswrapper[4860]: I0320 11:04:00.476109 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566744-9jlnw" Mar 20 11:04:01 crc kubenswrapper[4860]: I0320 11:04:01.536629 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566744-9jlnw"] Mar 20 11:04:01 crc kubenswrapper[4860]: I0320 11:04:01.548327 4860 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 11:04:02 crc kubenswrapper[4860]: I0320 11:04:02.029381 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566744-9jlnw" event={"ID":"d72f65a1-efc6-45f5-a056-d64ee5bce755","Type":"ContainerStarted","Data":"d137de43b1404cfa990e48637b758554dd38a8dea4bbd8ae48f23f76b0ad72e8"} Mar 20 11:04:03 crc kubenswrapper[4860]: I0320 11:04:03.035832 4860 generic.go:334] "Generic (PLEG): container finished" podID="d72f65a1-efc6-45f5-a056-d64ee5bce755" containerID="d842a5cd77f0d9f0965cbf10a0f92313f544e8649c8e3427de05d3a92939e32e" exitCode=0 Mar 20 11:04:03 crc kubenswrapper[4860]: I0320 11:04:03.035944 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566744-9jlnw" event={"ID":"d72f65a1-efc6-45f5-a056-d64ee5bce755","Type":"ContainerDied","Data":"d842a5cd77f0d9f0965cbf10a0f92313f544e8649c8e3427de05d3a92939e32e"} Mar 20 11:04:04 crc kubenswrapper[4860]: I0320 11:04:04.270207 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566744-9jlnw" Mar 20 11:04:04 crc kubenswrapper[4860]: I0320 11:04:04.323273 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2hc4\" (UniqueName: \"kubernetes.io/projected/d72f65a1-efc6-45f5-a056-d64ee5bce755-kube-api-access-j2hc4\") pod \"d72f65a1-efc6-45f5-a056-d64ee5bce755\" (UID: \"d72f65a1-efc6-45f5-a056-d64ee5bce755\") " Mar 20 11:04:04 crc kubenswrapper[4860]: I0320 11:04:04.331851 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d72f65a1-efc6-45f5-a056-d64ee5bce755-kube-api-access-j2hc4" (OuterVolumeSpecName: "kube-api-access-j2hc4") pod "d72f65a1-efc6-45f5-a056-d64ee5bce755" (UID: "d72f65a1-efc6-45f5-a056-d64ee5bce755"). InnerVolumeSpecName "kube-api-access-j2hc4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:04:04 crc kubenswrapper[4860]: I0320 11:04:04.425388 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2hc4\" (UniqueName: \"kubernetes.io/projected/d72f65a1-efc6-45f5-a056-d64ee5bce755-kube-api-access-j2hc4\") on node \"crc\" DevicePath \"\"" Mar 20 11:04:05 crc kubenswrapper[4860]: I0320 11:04:05.050350 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566744-9jlnw" event={"ID":"d72f65a1-efc6-45f5-a056-d64ee5bce755","Type":"ContainerDied","Data":"d137de43b1404cfa990e48637b758554dd38a8dea4bbd8ae48f23f76b0ad72e8"} Mar 20 11:04:05 crc kubenswrapper[4860]: I0320 11:04:05.050397 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d137de43b1404cfa990e48637b758554dd38a8dea4bbd8ae48f23f76b0ad72e8" Mar 20 11:04:05 crc kubenswrapper[4860]: I0320 11:04:05.050405 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566744-9jlnw" Mar 20 11:04:05 crc kubenswrapper[4860]: I0320 11:04:05.339187 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566738-5cj22"] Mar 20 11:04:05 crc kubenswrapper[4860]: I0320 11:04:05.343399 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566738-5cj22"] Mar 20 11:04:05 crc kubenswrapper[4860]: I0320 11:04:05.421696 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba2ab33e-6ecc-4eac-9aaa-256e6ff68236" path="/var/lib/kubelet/pods/ba2ab33e-6ecc-4eac-9aaa-256e6ff68236/volumes" Mar 20 11:04:40 crc kubenswrapper[4860]: I0320 11:04:40.711994 4860 scope.go:117] "RemoveContainer" containerID="5ed610d57137030afeaeb124289fb2f5072934d814423d8d1fd76ae4e4bbd772" Mar 20 11:04:52 crc kubenswrapper[4860]: I0320 11:04:52.344997 4860 patch_prober.go:28] interesting pod/machine-config-daemon-kvdqp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:04:52 crc kubenswrapper[4860]: I0320 11:04:52.345785 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:05:22 crc kubenswrapper[4860]: I0320 11:05:22.344724 4860 patch_prober.go:28] interesting pod/machine-config-daemon-kvdqp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:05:22 crc kubenswrapper[4860]: 
I0320 11:05:22.345672 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:05:40 crc kubenswrapper[4860]: I0320 11:05:40.751700 4860 scope.go:117] "RemoveContainer" containerID="133313b654a587091e098b1e8505700f3bdc77cfa2efebc3f2529891730788bc" Mar 20 11:05:40 crc kubenswrapper[4860]: I0320 11:05:40.800552 4860 scope.go:117] "RemoveContainer" containerID="a570fccad49b61cba0e967c8a578dc38074b1cc636e9a24780ad0104b28b074c" Mar 20 11:05:40 crc kubenswrapper[4860]: I0320 11:05:40.818702 4860 scope.go:117] "RemoveContainer" containerID="e9d5be622305f211c3e12e12dfe924fbcb59aee9d65c271786e393a4e06cec5a" Mar 20 11:05:52 crc kubenswrapper[4860]: I0320 11:05:52.344993 4860 patch_prober.go:28] interesting pod/machine-config-daemon-kvdqp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:05:52 crc kubenswrapper[4860]: I0320 11:05:52.346197 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:05:52 crc kubenswrapper[4860]: I0320 11:05:52.346291 4860 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" Mar 20 11:05:52 crc kubenswrapper[4860]: I0320 11:05:52.347158 4860 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bf060ef062b20409d88a33daa69a9b6db0709a5d9a572c42e386c266f6d32bae"} pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 11:05:52 crc kubenswrapper[4860]: I0320 11:05:52.347239 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" containerID="cri-o://bf060ef062b20409d88a33daa69a9b6db0709a5d9a572c42e386c266f6d32bae" gracePeriod=600 Mar 20 11:05:52 crc kubenswrapper[4860]: I0320 11:05:52.776516 4860 generic.go:334] "Generic (PLEG): container finished" podID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerID="bf060ef062b20409d88a33daa69a9b6db0709a5d9a572c42e386c266f6d32bae" exitCode=0 Mar 20 11:05:52 crc kubenswrapper[4860]: I0320 11:05:52.776602 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" event={"ID":"6a9df230-75a1-4b64-8d00-c179e9c19080","Type":"ContainerDied","Data":"bf060ef062b20409d88a33daa69a9b6db0709a5d9a572c42e386c266f6d32bae"} Mar 20 11:05:52 crc kubenswrapper[4860]: I0320 11:05:52.777058 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" event={"ID":"6a9df230-75a1-4b64-8d00-c179e9c19080","Type":"ContainerStarted","Data":"4b71d74fc211e299a940948264ea488965d39574d77d6b6b358fff8d3e35e4e3"} Mar 20 11:05:52 crc kubenswrapper[4860]: I0320 11:05:52.777098 4860 scope.go:117] "RemoveContainer" containerID="13fba41711384af20ab8200d2257307c76fbc844be8572bbe7995f53f6fb9ca6" Mar 20 11:06:00 crc kubenswrapper[4860]: I0320 11:06:00.139746 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566746-xphvf"] Mar 20 11:06:00 crc 
kubenswrapper[4860]: E0320 11:06:00.141037 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d72f65a1-efc6-45f5-a056-d64ee5bce755" containerName="oc" Mar 20 11:06:00 crc kubenswrapper[4860]: I0320 11:06:00.141058 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="d72f65a1-efc6-45f5-a056-d64ee5bce755" containerName="oc" Mar 20 11:06:00 crc kubenswrapper[4860]: I0320 11:06:00.141185 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="d72f65a1-efc6-45f5-a056-d64ee5bce755" containerName="oc" Mar 20 11:06:00 crc kubenswrapper[4860]: I0320 11:06:00.141698 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566746-xphvf" Mar 20 11:06:00 crc kubenswrapper[4860]: I0320 11:06:00.144141 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-82p2f" Mar 20 11:06:00 crc kubenswrapper[4860]: I0320 11:06:00.144885 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:06:00 crc kubenswrapper[4860]: I0320 11:06:00.144943 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:06:00 crc kubenswrapper[4860]: I0320 11:06:00.166461 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566746-xphvf"] Mar 20 11:06:00 crc kubenswrapper[4860]: I0320 11:06:00.187071 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qrbn\" (UniqueName: \"kubernetes.io/projected/200c6cd9-8753-4805-9a49-50d3e429ea33-kube-api-access-9qrbn\") pod \"auto-csr-approver-29566746-xphvf\" (UID: \"200c6cd9-8753-4805-9a49-50d3e429ea33\") " pod="openshift-infra/auto-csr-approver-29566746-xphvf" Mar 20 11:06:00 crc kubenswrapper[4860]: I0320 11:06:00.288464 4860 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-9qrbn\" (UniqueName: \"kubernetes.io/projected/200c6cd9-8753-4805-9a49-50d3e429ea33-kube-api-access-9qrbn\") pod \"auto-csr-approver-29566746-xphvf\" (UID: \"200c6cd9-8753-4805-9a49-50d3e429ea33\") " pod="openshift-infra/auto-csr-approver-29566746-xphvf" Mar 20 11:06:00 crc kubenswrapper[4860]: I0320 11:06:00.312027 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qrbn\" (UniqueName: \"kubernetes.io/projected/200c6cd9-8753-4805-9a49-50d3e429ea33-kube-api-access-9qrbn\") pod \"auto-csr-approver-29566746-xphvf\" (UID: \"200c6cd9-8753-4805-9a49-50d3e429ea33\") " pod="openshift-infra/auto-csr-approver-29566746-xphvf" Mar 20 11:06:00 crc kubenswrapper[4860]: I0320 11:06:00.466640 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566746-xphvf" Mar 20 11:06:00 crc kubenswrapper[4860]: I0320 11:06:00.699916 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566746-xphvf"] Mar 20 11:06:00 crc kubenswrapper[4860]: I0320 11:06:00.830505 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566746-xphvf" event={"ID":"200c6cd9-8753-4805-9a49-50d3e429ea33","Type":"ContainerStarted","Data":"0a95a865472411ce5cf93613ee574462714ddc527db1d102e3ca1fd536269940"} Mar 20 11:06:02 crc kubenswrapper[4860]: I0320 11:06:02.846090 4860 generic.go:334] "Generic (PLEG): container finished" podID="200c6cd9-8753-4805-9a49-50d3e429ea33" containerID="6b40403be918a788bbcc242393eb71ec98682fddffb9062133713238970f5b03" exitCode=0 Mar 20 11:06:02 crc kubenswrapper[4860]: I0320 11:06:02.846147 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566746-xphvf" event={"ID":"200c6cd9-8753-4805-9a49-50d3e429ea33","Type":"ContainerDied","Data":"6b40403be918a788bbcc242393eb71ec98682fddffb9062133713238970f5b03"} Mar 20 11:06:04 crc 
kubenswrapper[4860]: I0320 11:06:04.075641 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566746-xphvf" Mar 20 11:06:04 crc kubenswrapper[4860]: I0320 11:06:04.139164 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qrbn\" (UniqueName: \"kubernetes.io/projected/200c6cd9-8753-4805-9a49-50d3e429ea33-kube-api-access-9qrbn\") pod \"200c6cd9-8753-4805-9a49-50d3e429ea33\" (UID: \"200c6cd9-8753-4805-9a49-50d3e429ea33\") " Mar 20 11:06:04 crc kubenswrapper[4860]: I0320 11:06:04.146523 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/200c6cd9-8753-4805-9a49-50d3e429ea33-kube-api-access-9qrbn" (OuterVolumeSpecName: "kube-api-access-9qrbn") pod "200c6cd9-8753-4805-9a49-50d3e429ea33" (UID: "200c6cd9-8753-4805-9a49-50d3e429ea33"). InnerVolumeSpecName "kube-api-access-9qrbn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:06:04 crc kubenswrapper[4860]: I0320 11:06:04.240451 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qrbn\" (UniqueName: \"kubernetes.io/projected/200c6cd9-8753-4805-9a49-50d3e429ea33-kube-api-access-9qrbn\") on node \"crc\" DevicePath \"\"" Mar 20 11:06:04 crc kubenswrapper[4860]: I0320 11:06:04.858279 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566746-xphvf" event={"ID":"200c6cd9-8753-4805-9a49-50d3e429ea33","Type":"ContainerDied","Data":"0a95a865472411ce5cf93613ee574462714ddc527db1d102e3ca1fd536269940"} Mar 20 11:06:04 crc kubenswrapper[4860]: I0320 11:06:04.858322 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566746-xphvf"
Mar 20 11:06:04 crc kubenswrapper[4860]: I0320 11:06:04.858324 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a95a865472411ce5cf93613ee574462714ddc527db1d102e3ca1fd536269940"
Mar 20 11:06:05 crc kubenswrapper[4860]: I0320 11:06:05.140493 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566740-26bw9"]
Mar 20 11:06:05 crc kubenswrapper[4860]: I0320 11:06:05.144207 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566740-26bw9"]
Mar 20 11:06:05 crc kubenswrapper[4860]: I0320 11:06:05.424737 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b31d1240-ea69-4da9-9a40-70f252222d4d" path="/var/lib/kubelet/pods/b31d1240-ea69-4da9-9a40-70f252222d4d/volumes"
Mar 20 11:07:40 crc kubenswrapper[4860]: I0320 11:07:40.876948 4860 scope.go:117] "RemoveContainer" containerID="e24c70c83330a3f76a9f16d28ca14d8e62ae9184fa806a34c1edf7e65a681362"
Mar 20 11:07:52 crc kubenswrapper[4860]: I0320 11:07:52.344565 4860 patch_prober.go:28] interesting pod/machine-config-daemon-kvdqp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 11:07:52 crc kubenswrapper[4860]: I0320 11:07:52.345481 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 11:08:00 crc kubenswrapper[4860]: I0320 11:08:00.140083 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566748-z8vsk"]
Mar 20 11:08:00 crc kubenswrapper[4860]: E0320 11:08:00.141046 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="200c6cd9-8753-4805-9a49-50d3e429ea33" containerName="oc"
Mar 20 11:08:00 crc kubenswrapper[4860]: I0320 11:08:00.141064 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="200c6cd9-8753-4805-9a49-50d3e429ea33" containerName="oc"
Mar 20 11:08:00 crc kubenswrapper[4860]: I0320 11:08:00.141219 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="200c6cd9-8753-4805-9a49-50d3e429ea33" containerName="oc"
Mar 20 11:08:00 crc kubenswrapper[4860]: I0320 11:08:00.141739 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566748-z8vsk"
Mar 20 11:08:00 crc kubenswrapper[4860]: I0320 11:08:00.143999 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 11:08:00 crc kubenswrapper[4860]: I0320 11:08:00.144142 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 11:08:00 crc kubenswrapper[4860]: I0320 11:08:00.144311 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-82p2f"
Mar 20 11:08:00 crc kubenswrapper[4860]: I0320 11:08:00.182730 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566748-z8vsk"]
Mar 20 11:08:00 crc kubenswrapper[4860]: I0320 11:08:00.334161 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7pxh\" (UniqueName: \"kubernetes.io/projected/d05f5e64-f0ec-45f9-a491-7dde7bdf6538-kube-api-access-b7pxh\") pod \"auto-csr-approver-29566748-z8vsk\" (UID: \"d05f5e64-f0ec-45f9-a491-7dde7bdf6538\") " pod="openshift-infra/auto-csr-approver-29566748-z8vsk"
Mar 20 11:08:00 crc kubenswrapper[4860]: I0320 11:08:00.436606 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7pxh\" (UniqueName: \"kubernetes.io/projected/d05f5e64-f0ec-45f9-a491-7dde7bdf6538-kube-api-access-b7pxh\") pod \"auto-csr-approver-29566748-z8vsk\" (UID: \"d05f5e64-f0ec-45f9-a491-7dde7bdf6538\") " pod="openshift-infra/auto-csr-approver-29566748-z8vsk"
Mar 20 11:08:00 crc kubenswrapper[4860]: I0320 11:08:00.457884 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7pxh\" (UniqueName: \"kubernetes.io/projected/d05f5e64-f0ec-45f9-a491-7dde7bdf6538-kube-api-access-b7pxh\") pod \"auto-csr-approver-29566748-z8vsk\" (UID: \"d05f5e64-f0ec-45f9-a491-7dde7bdf6538\") " pod="openshift-infra/auto-csr-approver-29566748-z8vsk"
Mar 20 11:08:00 crc kubenswrapper[4860]: I0320 11:08:00.463484 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566748-z8vsk"
Mar 20 11:08:00 crc kubenswrapper[4860]: I0320 11:08:00.876981 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566748-z8vsk"]
Mar 20 11:08:01 crc kubenswrapper[4860]: I0320 11:08:01.583583 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566748-z8vsk" event={"ID":"d05f5e64-f0ec-45f9-a491-7dde7bdf6538","Type":"ContainerStarted","Data":"9bc9f0b9b0200e60208d540e2757fbf686cdf2e7c6ab7cbd74f90dcf986d30e6"}
Mar 20 11:08:02 crc kubenswrapper[4860]: I0320 11:08:02.590262 4860 generic.go:334] "Generic (PLEG): container finished" podID="d05f5e64-f0ec-45f9-a491-7dde7bdf6538" containerID="478ee16ae7828909784a1f93be49bfc8c3fee1419599f3474cd82711371e05b3" exitCode=0
Mar 20 11:08:02 crc kubenswrapper[4860]: I0320 11:08:02.590362 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566748-z8vsk" event={"ID":"d05f5e64-f0ec-45f9-a491-7dde7bdf6538","Type":"ContainerDied","Data":"478ee16ae7828909784a1f93be49bfc8c3fee1419599f3474cd82711371e05b3"}
Mar 20 11:08:03 crc kubenswrapper[4860]: I0320 11:08:03.839588 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566748-z8vsk"
Mar 20 11:08:03 crc kubenswrapper[4860]: I0320 11:08:03.982803 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b7pxh\" (UniqueName: \"kubernetes.io/projected/d05f5e64-f0ec-45f9-a491-7dde7bdf6538-kube-api-access-b7pxh\") pod \"d05f5e64-f0ec-45f9-a491-7dde7bdf6538\" (UID: \"d05f5e64-f0ec-45f9-a491-7dde7bdf6538\") "
Mar 20 11:08:03 crc kubenswrapper[4860]: I0320 11:08:03.991274 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d05f5e64-f0ec-45f9-a491-7dde7bdf6538-kube-api-access-b7pxh" (OuterVolumeSpecName: "kube-api-access-b7pxh") pod "d05f5e64-f0ec-45f9-a491-7dde7bdf6538" (UID: "d05f5e64-f0ec-45f9-a491-7dde7bdf6538"). InnerVolumeSpecName "kube-api-access-b7pxh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 11:08:04 crc kubenswrapper[4860]: I0320 11:08:04.084583 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b7pxh\" (UniqueName: \"kubernetes.io/projected/d05f5e64-f0ec-45f9-a491-7dde7bdf6538-kube-api-access-b7pxh\") on node \"crc\" DevicePath \"\""
Mar 20 11:08:04 crc kubenswrapper[4860]: I0320 11:08:04.606685 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566748-z8vsk" event={"ID":"d05f5e64-f0ec-45f9-a491-7dde7bdf6538","Type":"ContainerDied","Data":"9bc9f0b9b0200e60208d540e2757fbf686cdf2e7c6ab7cbd74f90dcf986d30e6"}
Mar 20 11:08:04 crc kubenswrapper[4860]: I0320 11:08:04.607057 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9bc9f0b9b0200e60208d540e2757fbf686cdf2e7c6ab7cbd74f90dcf986d30e6"
Mar 20 11:08:04 crc kubenswrapper[4860]: I0320 11:08:04.607126 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566748-z8vsk"
Mar 20 11:08:04 crc kubenswrapper[4860]: I0320 11:08:04.909911 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566742-tczkf"]
Mar 20 11:08:04 crc kubenswrapper[4860]: I0320 11:08:04.916877 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566742-tczkf"]
Mar 20 11:08:05 crc kubenswrapper[4860]: I0320 11:08:05.422634 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="980f5756-4935-469d-933b-f4e339ded9a4" path="/var/lib/kubelet/pods/980f5756-4935-469d-933b-f4e339ded9a4/volumes"
Mar 20 11:08:22 crc kubenswrapper[4860]: I0320 11:08:22.344140 4860 patch_prober.go:28] interesting pod/machine-config-daemon-kvdqp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 11:08:22 crc kubenswrapper[4860]: I0320 11:08:22.345329 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 11:08:36 crc kubenswrapper[4860]: I0320 11:08:36.725264 4860 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 20 11:08:40 crc kubenswrapper[4860]: I0320 11:08:40.931999 4860 scope.go:117] "RemoveContainer" containerID="5e0b7b6725e58dc6c9517f6806ff8c8ba7c117d2ad076272e2c94e40ea777f46"
Mar 20 11:08:52 crc kubenswrapper[4860]: I0320 11:08:52.344961 4860 patch_prober.go:28] interesting pod/machine-config-daemon-kvdqp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 11:08:52 crc kubenswrapper[4860]: I0320 11:08:52.346058 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 11:08:52 crc kubenswrapper[4860]: I0320 11:08:52.346124 4860 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp"
Mar 20 11:08:52 crc kubenswrapper[4860]: I0320 11:08:52.346933 4860 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4b71d74fc211e299a940948264ea488965d39574d77d6b6b358fff8d3e35e4e3"} pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 11:08:52 crc kubenswrapper[4860]: I0320 11:08:52.347005 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" containerID="cri-o://4b71d74fc211e299a940948264ea488965d39574d77d6b6b358fff8d3e35e4e3" gracePeriod=600
Mar 20 11:08:52 crc kubenswrapper[4860]: I0320 11:08:52.903193 4860 generic.go:334] "Generic (PLEG): container finished" podID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerID="4b71d74fc211e299a940948264ea488965d39574d77d6b6b358fff8d3e35e4e3" exitCode=0
Mar 20 11:08:52 crc kubenswrapper[4860]: I0320 11:08:52.904823 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" event={"ID":"6a9df230-75a1-4b64-8d00-c179e9c19080","Type":"ContainerDied","Data":"4b71d74fc211e299a940948264ea488965d39574d77d6b6b358fff8d3e35e4e3"}
Mar 20 11:08:52 crc kubenswrapper[4860]: I0320 11:08:52.904870 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" event={"ID":"6a9df230-75a1-4b64-8d00-c179e9c19080","Type":"ContainerStarted","Data":"8a778954d1374877671fc9bca86b7581cc1911487a943aba7bf61952dec5e818"}
Mar 20 11:08:52 crc kubenswrapper[4860]: I0320 11:08:52.904894 4860 scope.go:117] "RemoveContainer" containerID="bf060ef062b20409d88a33daa69a9b6db0709a5d9a572c42e386c266f6d32bae"
Mar 20 11:09:50 crc kubenswrapper[4860]: I0320 11:09:50.491170 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-nbkmw"]
Mar 20 11:09:50 crc kubenswrapper[4860]: I0320 11:09:50.492466 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="ovn-controller" containerID="cri-o://0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d" gracePeriod=30
Mar 20 11:09:50 crc kubenswrapper[4860]: I0320 11:09:50.492554 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="nbdb" containerID="cri-o://903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf" gracePeriod=30
Mar 20 11:09:50 crc kubenswrapper[4860]: I0320 11:09:50.492587 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18" gracePeriod=30
Mar 20 11:09:50 crc kubenswrapper[4860]: I0320 11:09:50.492635 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="ovn-acl-logging" containerID="cri-o://a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e" gracePeriod=30
Mar 20 11:09:50 crc kubenswrapper[4860]: I0320 11:09:50.492587 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="northd" containerID="cri-o://f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f" gracePeriod=30
Mar 20 11:09:50 crc kubenswrapper[4860]: I0320 11:09:50.492617 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="kube-rbac-proxy-node" containerID="cri-o://9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793" gracePeriod=30
Mar 20 11:09:50 crc kubenswrapper[4860]: I0320 11:09:50.492893 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="sbdb" containerID="cri-o://9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9" gracePeriod=30
Mar 20 11:09:50 crc kubenswrapper[4860]: I0320 11:09:50.632118 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="ovnkube-controller" containerID="cri-o://4f5a30ad09d458bec4ede239893e4825bfe7b1e45f543e00827124da5aa465f6" gracePeriod=30
Mar 20 11:09:50 crc kubenswrapper[4860]: I0320 11:09:50.944514 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nbkmw_eb85f6f9-1c0f-4388-9464-25dfe48d8d0f/ovnkube-controller/3.log"
Mar 20 11:09:50 crc kubenswrapper[4860]: I0320 11:09:50.947561 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nbkmw_eb85f6f9-1c0f-4388-9464-25dfe48d8d0f/ovn-acl-logging/0.log"
Mar 20 11:09:50 crc kubenswrapper[4860]: I0320 11:09:50.948070 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nbkmw_eb85f6f9-1c0f-4388-9464-25dfe48d8d0f/ovn-controller/0.log"
Mar 20 11:09:50 crc kubenswrapper[4860]: I0320 11:09:50.948658 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw"
Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.018150 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-dbgf2"]
Mar 20 11:09:51 crc kubenswrapper[4860]: E0320 11:09:51.018613 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="kube-rbac-proxy-ovn-metrics"
Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.018638 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="kube-rbac-proxy-ovn-metrics"
Mar 20 11:09:51 crc kubenswrapper[4860]: E0320 11:09:51.018657 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="kube-rbac-proxy-node"
Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.018671 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="kube-rbac-proxy-node"
Mar 20 11:09:51 crc kubenswrapper[4860]: E0320 11:09:51.018696 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="northd"
Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.018710 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="northd"
Mar 20 11:09:51 crc kubenswrapper[4860]: E0320 11:09:51.018727 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="ovnkube-controller"
Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.018739 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="ovnkube-controller"
Mar 20 11:09:51 crc kubenswrapper[4860]: E0320 11:09:51.018755 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="ovn-controller"
Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.018768 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="ovn-controller"
Mar 20 11:09:51 crc kubenswrapper[4860]: E0320 11:09:51.018785 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="kubecfg-setup"
Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.018797 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="kubecfg-setup"
Mar 20 11:09:51 crc kubenswrapper[4860]: E0320 11:09:51.018813 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="ovnkube-controller"
Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.018856 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="ovnkube-controller"
Mar 20 11:09:51 crc kubenswrapper[4860]: E0320 11:09:51.018870 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="ovn-acl-logging"
Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.018882 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="ovn-acl-logging"
Mar 20 11:09:51 crc kubenswrapper[4860]: E0320 11:09:51.018900 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="sbdb"
Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.018912 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="sbdb"
Mar 20 11:09:51 crc kubenswrapper[4860]: E0320 11:09:51.018930 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d05f5e64-f0ec-45f9-a491-7dde7bdf6538" containerName="oc"
Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.018944 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="d05f5e64-f0ec-45f9-a491-7dde7bdf6538" containerName="oc"
Mar 20 11:09:51 crc kubenswrapper[4860]: E0320 11:09:51.018963 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="ovnkube-controller"
Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.018975 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="ovnkube-controller"
Mar 20 11:09:51 crc kubenswrapper[4860]: E0320 11:09:51.018989 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="nbdb"
Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.019001 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="nbdb"
Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.019170 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="sbdb"
Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.019194 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="d05f5e64-f0ec-45f9-a491-7dde7bdf6538" containerName="oc"
Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.019217 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="ovnkube-controller"
Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.019370 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="nbdb"
Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.019386 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="ovnkube-controller"
Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.019400 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="kube-rbac-proxy-ovn-metrics"
Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.019417 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="ovn-acl-logging"
Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.019435 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="northd"
Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.019455 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="ovnkube-controller"
Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.019467 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="kube-rbac-proxy-node"
Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.019483 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="ovn-controller"
Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.019495 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="ovnkube-controller"
Mar 20 11:09:51 crc kubenswrapper[4860]: E0320 11:09:51.019657 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="ovnkube-controller"
Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.019671 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="ovnkube-controller"
Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.019838 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="ovnkube-controller"
Mar 20 11:09:51 crc kubenswrapper[4860]: E0320 11:09:51.020003 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="ovnkube-controller"
Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.020028 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerName="ovnkube-controller"
Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.022948 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2"
Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.045247 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9btp\" (UniqueName: \"kubernetes.io/projected/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-kube-api-access-l9btp\") pod \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") "
Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.045364 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-etc-openvswitch\") pod \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") "
Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.045411 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-run-ovn-kubernetes\") pod \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") "
Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.045440 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-run-systemd\") pod \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") "
Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.045507 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-ovnkube-script-lib\") pod \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") "
Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.045554 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-env-overrides\") pod \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") "
Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.045597 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-cni-bin\") pod \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") "
Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.045642 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-ovn-node-metrics-cert\") pod \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") "
Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.045674 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-log-socket\") pod \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") "
Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.045724 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-node-log\") pod \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") "
Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.045764 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-systemd-units\") pod \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") "
Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.045792 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-run-ovn\") pod \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") "
Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.045834 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-slash\") pod \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") "
Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.045865 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") "
Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.045896 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-run-netns\") pod \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") "
Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.045929 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-ovnkube-config\") pod \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") "
Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.045973 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-run-openvswitch\") pod \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") "
Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.046027 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-var-lib-openvswitch\") pod \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") "
Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.046061 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-kubelet\") pod \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") "
Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.046093 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-cni-netd\") pod \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\" (UID: \"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f\") "
Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.046365 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-host-slash\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2"
Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.046408 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-log-socket\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2"
Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.046442 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-node-log\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2"
Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.046488 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-ovnkube-script-lib\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2"
Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.046528 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6gvr\" (UniqueName: \"kubernetes.io/projected/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-kube-api-access-m6gvr\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2"
Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.046561 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-host-cni-netd\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2"
Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.046594 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-host-run-netns\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2"
Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.046627 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-var-lib-openvswitch\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2"
Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.046660 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-ovn-node-metrics-cert\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2"
Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.046697 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-host-kubelet\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2"
Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.046738 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-ovnkube-config\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2"
Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.046806 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2"
Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.046840 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-etc-openvswitch\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2"
Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.046872 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-run-openvswitch\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2"
Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.046903 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-host-run-ovn-kubernetes\") pod \"ovnkube-node-dbgf2\" (UID: 
\"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.046945 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-systemd-units\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.046979 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-run-systemd\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.047015 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-env-overrides\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.047044 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-run-ovn\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.047075 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-host-cni-bin\") pod \"ovnkube-node-dbgf2\" (UID: 
\"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.047173 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" (UID: "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.047207 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" (UID: "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.047268 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" (UID: "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.047269 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" (UID: "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.047296 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" (UID: "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.047317 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" (UID: "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.047354 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" (UID: "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.047540 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-node-log" (OuterVolumeSpecName: "node-log") pod "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" (UID: "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.047571 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" (UID: "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.048103 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-log-socket" (OuterVolumeSpecName: "log-socket") pod "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" (UID: "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.048147 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-slash" (OuterVolumeSpecName: "host-slash") pod "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" (UID: "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.048174 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" (UID: "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.048199 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" (UID: "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.048201 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" (UID: "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.048310 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" (UID: "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.048420 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" (UID: "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.048454 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" (UID: "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.055248 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-kube-api-access-l9btp" (OuterVolumeSpecName: "kube-api-access-l9btp") pod "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" (UID: "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f"). InnerVolumeSpecName "kube-api-access-l9btp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.057945 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" (UID: "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.065916 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" (UID: "eb85f6f9-1c0f-4388-9464-25dfe48d8d0f"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.148032 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-host-kubelet\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.148112 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-ovnkube-config\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.148188 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.148200 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-host-kubelet\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.148254 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-etc-openvswitch\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.148291 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-run-openvswitch\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.148320 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-host-run-ovn-kubernetes\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.148359 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-systemd-units\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.148394 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-run-systemd\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.148429 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-env-overrides\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc 
kubenswrapper[4860]: I0320 11:09:51.148458 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-run-ovn\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.148490 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-host-cni-bin\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.148522 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-host-slash\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.148548 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-log-socket\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.148577 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-node-log\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.148616 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-ovnkube-script-lib\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.148653 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6gvr\" (UniqueName: \"kubernetes.io/projected/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-kube-api-access-m6gvr\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.148683 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-host-cni-netd\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.148716 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-host-run-netns\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.148747 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-var-lib-openvswitch\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.148777 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-ovn-node-metrics-cert\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.148843 4860 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.148868 4860 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.148886 4860 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.148904 4860 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.148922 4860 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-log-socket\") on node \"crc\" DevicePath \"\"" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.148938 4860 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-node-log\") on node \"crc\" DevicePath \"\"" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.148956 4860 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.148975 4860 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.148991 4860 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-slash\") on node \"crc\" DevicePath \"\"" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.149009 4860 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.149025 4860 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.149045 4860 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.149063 4860 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.149081 4860 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-var-lib-openvswitch\") 
on node \"crc\" DevicePath \"\"" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.149533 4860 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.150297 4860 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.150311 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-host-cni-bin\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.149212 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-host-cni-netd\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.149281 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-etc-openvswitch\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.149341 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-host-run-netns\") pod \"ovnkube-node-dbgf2\" (UID: 
\"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.150404 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9btp\" (UniqueName: \"kubernetes.io/projected/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-kube-api-access-l9btp\") on node \"crc\" DevicePath \"\"" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.149374 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-var-lib-openvswitch\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.150430 4860 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.150478 4860 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.149637 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-ovnkube-config\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.150497 4860 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 20 11:09:51 crc kubenswrapper[4860]: 
I0320 11:09:51.149711 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-node-log\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.149674 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-log-socket\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.149723 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-run-ovn\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.149173 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-host-run-ovn-kubernetes\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.149313 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-run-openvswitch\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.149429 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-systemd-units\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.150656 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-ovnkube-script-lib\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.149217 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.149680 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-host-slash\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.150212 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-env-overrides\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.149402 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-run-systemd\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.156433 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-ovn-node-metrics-cert\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.166874 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6gvr\" (UniqueName: \"kubernetes.io/projected/fdde4a9b-30f5-42b1-8f84-d913cea8ddc8-kube-api-access-m6gvr\") pod \"ovnkube-node-dbgf2\" (UID: \"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.267865 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nbkmw_eb85f6f9-1c0f-4388-9464-25dfe48d8d0f/ovnkube-controller/3.log" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.273053 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nbkmw_eb85f6f9-1c0f-4388-9464-25dfe48d8d0f/ovn-acl-logging/0.log" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.273650 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nbkmw_eb85f6f9-1c0f-4388-9464-25dfe48d8d0f/ovn-controller/0.log" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275033 4860 generic.go:334] "Generic (PLEG): container finished" podID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerID="4f5a30ad09d458bec4ede239893e4825bfe7b1e45f543e00827124da5aa465f6" exitCode=0 Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275070 
4860 generic.go:334] "Generic (PLEG): container finished" podID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerID="9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9" exitCode=0 Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275078 4860 generic.go:334] "Generic (PLEG): container finished" podID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerID="903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf" exitCode=0 Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275085 4860 generic.go:334] "Generic (PLEG): container finished" podID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerID="f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f" exitCode=0 Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275093 4860 generic.go:334] "Generic (PLEG): container finished" podID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerID="9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18" exitCode=0 Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275082 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" event={"ID":"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f","Type":"ContainerDied","Data":"4f5a30ad09d458bec4ede239893e4825bfe7b1e45f543e00827124da5aa465f6"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275142 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" event={"ID":"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f","Type":"ContainerDied","Data":"9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275157 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" event={"ID":"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f","Type":"ContainerDied","Data":"903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275168 4860 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" event={"ID":"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f","Type":"ContainerDied","Data":"f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275100 4860 generic.go:334] "Generic (PLEG): container finished" podID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerID="9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793" exitCode=0 Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275171 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275209 4860 scope.go:117] "RemoveContainer" containerID="4f5a30ad09d458bec4ede239893e4825bfe7b1e45f543e00827124da5aa465f6" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275205 4860 generic.go:334] "Generic (PLEG): container finished" podID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerID="a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e" exitCode=143 Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275251 4860 generic.go:334] "Generic (PLEG): container finished" podID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" containerID="0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d" exitCode=143 Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275186 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" event={"ID":"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f","Type":"ContainerDied","Data":"9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275413 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" 
event={"ID":"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f","Type":"ContainerDied","Data":"9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275456 4860 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8b2175e8b0f3d0a1b8ef8642ce83c4b05c7ce1f29d0f3417822f71330a375049"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275476 4860 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275483 4860 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275489 4860 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275495 4860 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275501 4860 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275507 4860 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275520 4860 pod_container_deletor.go:114] "Failed 
to issue the request to remove container" containerID={"Type":"cri-o","ID":"0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275526 4860 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275535 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" event={"ID":"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f","Type":"ContainerDied","Data":"a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275545 4860 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4f5a30ad09d458bec4ede239893e4825bfe7b1e45f543e00827124da5aa465f6"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275552 4860 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8b2175e8b0f3d0a1b8ef8642ce83c4b05c7ce1f29d0f3417822f71330a375049"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275558 4860 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275567 4860 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275572 4860 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f"} Mar 20 11:09:51 crc kubenswrapper[4860]: 
I0320 11:09:51.275578 4860 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275586 4860 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275594 4860 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275600 4860 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275606 4860 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275613 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" event={"ID":"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f","Type":"ContainerDied","Data":"0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275622 4860 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4f5a30ad09d458bec4ede239893e4825bfe7b1e45f543e00827124da5aa465f6"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275629 4860 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"8b2175e8b0f3d0a1b8ef8642ce83c4b05c7ce1f29d0f3417822f71330a375049"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275634 4860 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275640 4860 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275645 4860 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275650 4860 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275656 4860 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275661 4860 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275666 4860 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275673 4860 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275681 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nbkmw" event={"ID":"eb85f6f9-1c0f-4388-9464-25dfe48d8d0f","Type":"ContainerDied","Data":"f02c2b7cd3b6fd372807cb3bef1e94c63376211be0ef5968a6850857b9722fe7"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275689 4860 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4f5a30ad09d458bec4ede239893e4825bfe7b1e45f543e00827124da5aa465f6"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275696 4860 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8b2175e8b0f3d0a1b8ef8642ce83c4b05c7ce1f29d0f3417822f71330a375049"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275702 4860 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275708 4860 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275714 4860 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275720 4860 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275725 4860 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275730 4860 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275736 4860 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.275741 4860 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.278693 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cmc44_a89c8af2-338f-401f-aad5-c6d7763a3b3a/kube-multus/2.log" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.279870 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cmc44_a89c8af2-338f-401f-aad5-c6d7763a3b3a/kube-multus/1.log" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.279973 4860 generic.go:334] "Generic (PLEG): container finished" podID="a89c8af2-338f-401f-aad5-c6d7763a3b3a" containerID="bbf0bd8dd1e8efce7a65cd6499f4e5d67e95f7c0af27658c16d6dad07affb764" exitCode=2 Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.280068 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cmc44" event={"ID":"a89c8af2-338f-401f-aad5-c6d7763a3b3a","Type":"ContainerDied","Data":"bbf0bd8dd1e8efce7a65cd6499f4e5d67e95f7c0af27658c16d6dad07affb764"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.280121 4860 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e1d897530152cf8d1ddca69100f0ae29a4da57552de29a47aaed46aa70fa805e"} Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.280706 4860 scope.go:117] "RemoveContainer" containerID="bbf0bd8dd1e8efce7a65cd6499f4e5d67e95f7c0af27658c16d6dad07affb764" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.306511 4860 scope.go:117] "RemoveContainer" containerID="8b2175e8b0f3d0a1b8ef8642ce83c4b05c7ce1f29d0f3417822f71330a375049" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.333354 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-nbkmw"] Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.334211 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-nbkmw"] Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.337965 4860 scope.go:117] "RemoveContainer" containerID="9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.346099 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.367141 4860 scope.go:117] "RemoveContainer" containerID="903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.389986 4860 scope.go:117] "RemoveContainer" containerID="f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.417854 4860 scope.go:117] "RemoveContainer" containerID="9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.422110 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb85f6f9-1c0f-4388-9464-25dfe48d8d0f" path="/var/lib/kubelet/pods/eb85f6f9-1c0f-4388-9464-25dfe48d8d0f/volumes" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.438240 4860 scope.go:117] "RemoveContainer" containerID="9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.482036 4860 scope.go:117] "RemoveContainer" containerID="a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.510106 4860 scope.go:117] "RemoveContainer" containerID="0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.545456 4860 scope.go:117] "RemoveContainer" containerID="95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.606265 4860 scope.go:117] "RemoveContainer" containerID="4f5a30ad09d458bec4ede239893e4825bfe7b1e45f543e00827124da5aa465f6" Mar 20 11:09:51 crc kubenswrapper[4860]: E0320 11:09:51.606946 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"4f5a30ad09d458bec4ede239893e4825bfe7b1e45f543e00827124da5aa465f6\": container with ID starting with 4f5a30ad09d458bec4ede239893e4825bfe7b1e45f543e00827124da5aa465f6 not found: ID does not exist" containerID="4f5a30ad09d458bec4ede239893e4825bfe7b1e45f543e00827124da5aa465f6" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.607005 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f5a30ad09d458bec4ede239893e4825bfe7b1e45f543e00827124da5aa465f6"} err="failed to get container status \"4f5a30ad09d458bec4ede239893e4825bfe7b1e45f543e00827124da5aa465f6\": rpc error: code = NotFound desc = could not find container \"4f5a30ad09d458bec4ede239893e4825bfe7b1e45f543e00827124da5aa465f6\": container with ID starting with 4f5a30ad09d458bec4ede239893e4825bfe7b1e45f543e00827124da5aa465f6 not found: ID does not exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.607043 4860 scope.go:117] "RemoveContainer" containerID="8b2175e8b0f3d0a1b8ef8642ce83c4b05c7ce1f29d0f3417822f71330a375049" Mar 20 11:09:51 crc kubenswrapper[4860]: E0320 11:09:51.607419 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b2175e8b0f3d0a1b8ef8642ce83c4b05c7ce1f29d0f3417822f71330a375049\": container with ID starting with 8b2175e8b0f3d0a1b8ef8642ce83c4b05c7ce1f29d0f3417822f71330a375049 not found: ID does not exist" containerID="8b2175e8b0f3d0a1b8ef8642ce83c4b05c7ce1f29d0f3417822f71330a375049" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.607457 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b2175e8b0f3d0a1b8ef8642ce83c4b05c7ce1f29d0f3417822f71330a375049"} err="failed to get container status \"8b2175e8b0f3d0a1b8ef8642ce83c4b05c7ce1f29d0f3417822f71330a375049\": rpc error: code = NotFound desc = could not find container \"8b2175e8b0f3d0a1b8ef8642ce83c4b05c7ce1f29d0f3417822f71330a375049\": container with ID 
starting with 8b2175e8b0f3d0a1b8ef8642ce83c4b05c7ce1f29d0f3417822f71330a375049 not found: ID does not exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.607486 4860 scope.go:117] "RemoveContainer" containerID="9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9" Mar 20 11:09:51 crc kubenswrapper[4860]: E0320 11:09:51.607748 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9\": container with ID starting with 9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9 not found: ID does not exist" containerID="9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.607782 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9"} err="failed to get container status \"9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9\": rpc error: code = NotFound desc = could not find container \"9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9\": container with ID starting with 9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9 not found: ID does not exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.607811 4860 scope.go:117] "RemoveContainer" containerID="903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf" Mar 20 11:09:51 crc kubenswrapper[4860]: E0320 11:09:51.608035 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf\": container with ID starting with 903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf not found: ID does not exist" containerID="903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf" Mar 20 
11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.608064 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf"} err="failed to get container status \"903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf\": rpc error: code = NotFound desc = could not find container \"903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf\": container with ID starting with 903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf not found: ID does not exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.608079 4860 scope.go:117] "RemoveContainer" containerID="f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f" Mar 20 11:09:51 crc kubenswrapper[4860]: E0320 11:09:51.608362 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f\": container with ID starting with f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f not found: ID does not exist" containerID="f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.608404 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f"} err="failed to get container status \"f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f\": rpc error: code = NotFound desc = could not find container \"f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f\": container with ID starting with f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f not found: ID does not exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.608430 4860 scope.go:117] "RemoveContainer" 
containerID="9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18" Mar 20 11:09:51 crc kubenswrapper[4860]: E0320 11:09:51.608664 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18\": container with ID starting with 9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18 not found: ID does not exist" containerID="9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.608695 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18"} err="failed to get container status \"9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18\": rpc error: code = NotFound desc = could not find container \"9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18\": container with ID starting with 9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18 not found: ID does not exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.608712 4860 scope.go:117] "RemoveContainer" containerID="9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793" Mar 20 11:09:51 crc kubenswrapper[4860]: E0320 11:09:51.609127 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793\": container with ID starting with 9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793 not found: ID does not exist" containerID="9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.609150 4860 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793"} err="failed to get container status \"9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793\": rpc error: code = NotFound desc = could not find container \"9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793\": container with ID starting with 9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793 not found: ID does not exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.609163 4860 scope.go:117] "RemoveContainer" containerID="a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e" Mar 20 11:09:51 crc kubenswrapper[4860]: E0320 11:09:51.609413 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e\": container with ID starting with a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e not found: ID does not exist" containerID="a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.609434 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e"} err="failed to get container status \"a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e\": rpc error: code = NotFound desc = could not find container \"a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e\": container with ID starting with a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e not found: ID does not exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.609454 4860 scope.go:117] "RemoveContainer" containerID="0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d" Mar 20 11:09:51 crc kubenswrapper[4860]: E0320 11:09:51.609943 4860 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d\": container with ID starting with 0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d not found: ID does not exist" containerID="0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.609973 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d"} err="failed to get container status \"0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d\": rpc error: code = NotFound desc = could not find container \"0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d\": container with ID starting with 0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d not found: ID does not exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.609992 4860 scope.go:117] "RemoveContainer" containerID="95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11" Mar 20 11:09:51 crc kubenswrapper[4860]: E0320 11:09:51.610210 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\": container with ID starting with 95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11 not found: ID does not exist" containerID="95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.610246 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11"} err="failed to get container status \"95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\": rpc error: code = NotFound desc = could not find container 
\"95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\": container with ID starting with 95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11 not found: ID does not exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.610262 4860 scope.go:117] "RemoveContainer" containerID="4f5a30ad09d458bec4ede239893e4825bfe7b1e45f543e00827124da5aa465f6" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.610442 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f5a30ad09d458bec4ede239893e4825bfe7b1e45f543e00827124da5aa465f6"} err="failed to get container status \"4f5a30ad09d458bec4ede239893e4825bfe7b1e45f543e00827124da5aa465f6\": rpc error: code = NotFound desc = could not find container \"4f5a30ad09d458bec4ede239893e4825bfe7b1e45f543e00827124da5aa465f6\": container with ID starting with 4f5a30ad09d458bec4ede239893e4825bfe7b1e45f543e00827124da5aa465f6 not found: ID does not exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.610462 4860 scope.go:117] "RemoveContainer" containerID="8b2175e8b0f3d0a1b8ef8642ce83c4b05c7ce1f29d0f3417822f71330a375049" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.610701 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b2175e8b0f3d0a1b8ef8642ce83c4b05c7ce1f29d0f3417822f71330a375049"} err="failed to get container status \"8b2175e8b0f3d0a1b8ef8642ce83c4b05c7ce1f29d0f3417822f71330a375049\": rpc error: code = NotFound desc = could not find container \"8b2175e8b0f3d0a1b8ef8642ce83c4b05c7ce1f29d0f3417822f71330a375049\": container with ID starting with 8b2175e8b0f3d0a1b8ef8642ce83c4b05c7ce1f29d0f3417822f71330a375049 not found: ID does not exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.610724 4860 scope.go:117] "RemoveContainer" containerID="9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.610935 4860 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9"} err="failed to get container status \"9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9\": rpc error: code = NotFound desc = could not find container \"9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9\": container with ID starting with 9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9 not found: ID does not exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.610962 4860 scope.go:117] "RemoveContainer" containerID="903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.611212 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf"} err="failed to get container status \"903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf\": rpc error: code = NotFound desc = could not find container \"903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf\": container with ID starting with 903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf not found: ID does not exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.611257 4860 scope.go:117] "RemoveContainer" containerID="f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.611477 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f"} err="failed to get container status \"f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f\": rpc error: code = NotFound desc = could not find container \"f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f\": container with ID starting with 
f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f not found: ID does not exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.611499 4860 scope.go:117] "RemoveContainer" containerID="9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.612124 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18"} err="failed to get container status \"9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18\": rpc error: code = NotFound desc = could not find container \"9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18\": container with ID starting with 9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18 not found: ID does not exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.612153 4860 scope.go:117] "RemoveContainer" containerID="9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.612407 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793"} err="failed to get container status \"9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793\": rpc error: code = NotFound desc = could not find container \"9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793\": container with ID starting with 9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793 not found: ID does not exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.612441 4860 scope.go:117] "RemoveContainer" containerID="a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.612879 4860 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e"} err="failed to get container status \"a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e\": rpc error: code = NotFound desc = could not find container \"a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e\": container with ID starting with a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e not found: ID does not exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.612900 4860 scope.go:117] "RemoveContainer" containerID="0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.613131 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d"} err="failed to get container status \"0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d\": rpc error: code = NotFound desc = could not find container \"0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d\": container with ID starting with 0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d not found: ID does not exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.613162 4860 scope.go:117] "RemoveContainer" containerID="95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.613390 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11"} err="failed to get container status \"95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\": rpc error: code = NotFound desc = could not find container \"95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\": container with ID starting with 95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11 not found: ID does not 
exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.613422 4860 scope.go:117] "RemoveContainer" containerID="4f5a30ad09d458bec4ede239893e4825bfe7b1e45f543e00827124da5aa465f6" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.613636 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f5a30ad09d458bec4ede239893e4825bfe7b1e45f543e00827124da5aa465f6"} err="failed to get container status \"4f5a30ad09d458bec4ede239893e4825bfe7b1e45f543e00827124da5aa465f6\": rpc error: code = NotFound desc = could not find container \"4f5a30ad09d458bec4ede239893e4825bfe7b1e45f543e00827124da5aa465f6\": container with ID starting with 4f5a30ad09d458bec4ede239893e4825bfe7b1e45f543e00827124da5aa465f6 not found: ID does not exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.613665 4860 scope.go:117] "RemoveContainer" containerID="8b2175e8b0f3d0a1b8ef8642ce83c4b05c7ce1f29d0f3417822f71330a375049" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.613868 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b2175e8b0f3d0a1b8ef8642ce83c4b05c7ce1f29d0f3417822f71330a375049"} err="failed to get container status \"8b2175e8b0f3d0a1b8ef8642ce83c4b05c7ce1f29d0f3417822f71330a375049\": rpc error: code = NotFound desc = could not find container \"8b2175e8b0f3d0a1b8ef8642ce83c4b05c7ce1f29d0f3417822f71330a375049\": container with ID starting with 8b2175e8b0f3d0a1b8ef8642ce83c4b05c7ce1f29d0f3417822f71330a375049 not found: ID does not exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.613890 4860 scope.go:117] "RemoveContainer" containerID="9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.614054 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9"} err="failed to get container status 
\"9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9\": rpc error: code = NotFound desc = could not find container \"9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9\": container with ID starting with 9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9 not found: ID does not exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.614078 4860 scope.go:117] "RemoveContainer" containerID="903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.614302 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf"} err="failed to get container status \"903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf\": rpc error: code = NotFound desc = could not find container \"903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf\": container with ID starting with 903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf not found: ID does not exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.614322 4860 scope.go:117] "RemoveContainer" containerID="f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.614530 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f"} err="failed to get container status \"f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f\": rpc error: code = NotFound desc = could not find container \"f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f\": container with ID starting with f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f not found: ID does not exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.614560 4860 scope.go:117] "RemoveContainer" 
containerID="9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.614892 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18"} err="failed to get container status \"9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18\": rpc error: code = NotFound desc = could not find container \"9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18\": container with ID starting with 9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18 not found: ID does not exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.614915 4860 scope.go:117] "RemoveContainer" containerID="9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.615139 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793"} err="failed to get container status \"9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793\": rpc error: code = NotFound desc = could not find container \"9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793\": container with ID starting with 9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793 not found: ID does not exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.615164 4860 scope.go:117] "RemoveContainer" containerID="a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.615443 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e"} err="failed to get container status \"a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e\": rpc error: code = NotFound desc = could 
not find container \"a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e\": container with ID starting with a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e not found: ID does not exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.615472 4860 scope.go:117] "RemoveContainer" containerID="0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.615745 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d"} err="failed to get container status \"0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d\": rpc error: code = NotFound desc = could not find container \"0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d\": container with ID starting with 0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d not found: ID does not exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.615766 4860 scope.go:117] "RemoveContainer" containerID="95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.615966 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11"} err="failed to get container status \"95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\": rpc error: code = NotFound desc = could not find container \"95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\": container with ID starting with 95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11 not found: ID does not exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.615988 4860 scope.go:117] "RemoveContainer" containerID="4f5a30ad09d458bec4ede239893e4825bfe7b1e45f543e00827124da5aa465f6" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 
11:09:51.616190 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f5a30ad09d458bec4ede239893e4825bfe7b1e45f543e00827124da5aa465f6"} err="failed to get container status \"4f5a30ad09d458bec4ede239893e4825bfe7b1e45f543e00827124da5aa465f6\": rpc error: code = NotFound desc = could not find container \"4f5a30ad09d458bec4ede239893e4825bfe7b1e45f543e00827124da5aa465f6\": container with ID starting with 4f5a30ad09d458bec4ede239893e4825bfe7b1e45f543e00827124da5aa465f6 not found: ID does not exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.616209 4860 scope.go:117] "RemoveContainer" containerID="8b2175e8b0f3d0a1b8ef8642ce83c4b05c7ce1f29d0f3417822f71330a375049" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.616458 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b2175e8b0f3d0a1b8ef8642ce83c4b05c7ce1f29d0f3417822f71330a375049"} err="failed to get container status \"8b2175e8b0f3d0a1b8ef8642ce83c4b05c7ce1f29d0f3417822f71330a375049\": rpc error: code = NotFound desc = could not find container \"8b2175e8b0f3d0a1b8ef8642ce83c4b05c7ce1f29d0f3417822f71330a375049\": container with ID starting with 8b2175e8b0f3d0a1b8ef8642ce83c4b05c7ce1f29d0f3417822f71330a375049 not found: ID does not exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.616476 4860 scope.go:117] "RemoveContainer" containerID="9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.616648 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9"} err="failed to get container status \"9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9\": rpc error: code = NotFound desc = could not find container \"9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9\": container with ID starting with 
9db5160a896b3335c9d52bf8e90d3162f2af00c8377625507d59ad11b2938ae9 not found: ID does not exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.616664 4860 scope.go:117] "RemoveContainer" containerID="903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.616871 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf"} err="failed to get container status \"903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf\": rpc error: code = NotFound desc = could not find container \"903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf\": container with ID starting with 903989cd0cc72abfd7ab476660747c74658fbad07bd3c92cf1cdaa2533df17bf not found: ID does not exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.616894 4860 scope.go:117] "RemoveContainer" containerID="f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.617073 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f"} err="failed to get container status \"f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f\": rpc error: code = NotFound desc = could not find container \"f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f\": container with ID starting with f4e8db06bda4cff6adfda647ad9a184ca87be61aa350dcc7d19aefaddaf7056f not found: ID does not exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.617090 4860 scope.go:117] "RemoveContainer" containerID="9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.617276 4860 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18"} err="failed to get container status \"9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18\": rpc error: code = NotFound desc = could not find container \"9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18\": container with ID starting with 9b3f7f07bfe4e53c14cb6a5e61d60ce5d4a309a64c93cf1121d3373b2ff27e18 not found: ID does not exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.617295 4860 scope.go:117] "RemoveContainer" containerID="9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.617513 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793"} err="failed to get container status \"9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793\": rpc error: code = NotFound desc = could not find container \"9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793\": container with ID starting with 9e1fdae91c3fc2c8385d89bdfadfb566c2bec660c3991720a695d2c0790f6793 not found: ID does not exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.617535 4860 scope.go:117] "RemoveContainer" containerID="a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.617778 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e"} err="failed to get container status \"a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e\": rpc error: code = NotFound desc = could not find container \"a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e\": container with ID starting with a84809569015510188ed6aee8bb1ab1ef109ae1b7cbd5800bad779ddde3ec92e not found: ID does not 
exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.617802 4860 scope.go:117] "RemoveContainer" containerID="0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.618048 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d"} err="failed to get container status \"0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d\": rpc error: code = NotFound desc = could not find container \"0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d\": container with ID starting with 0418ad0ba5b132db2e63d9263e6db6bc9dfd4f2cff92e44195e122658bea920d not found: ID does not exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.618070 4860 scope.go:117] "RemoveContainer" containerID="95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.618309 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11"} err="failed to get container status \"95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\": rpc error: code = NotFound desc = could not find container \"95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11\": container with ID starting with 95cfa536099d3d124f8abf90e2bcf1f06688ca6e70317d32c385617e7e238c11 not found: ID does not exist" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.618340 4860 scope.go:117] "RemoveContainer" containerID="4f5a30ad09d458bec4ede239893e4825bfe7b1e45f543e00827124da5aa465f6" Mar 20 11:09:51 crc kubenswrapper[4860]: I0320 11:09:51.618561 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f5a30ad09d458bec4ede239893e4825bfe7b1e45f543e00827124da5aa465f6"} err="failed to get container status 
\"4f5a30ad09d458bec4ede239893e4825bfe7b1e45f543e00827124da5aa465f6\": rpc error: code = NotFound desc = could not find container \"4f5a30ad09d458bec4ede239893e4825bfe7b1e45f543e00827124da5aa465f6\": container with ID starting with 4f5a30ad09d458bec4ede239893e4825bfe7b1e45f543e00827124da5aa465f6 not found: ID does not exist" Mar 20 11:09:52 crc kubenswrapper[4860]: I0320 11:09:52.315793 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cmc44_a89c8af2-338f-401f-aad5-c6d7763a3b3a/kube-multus/2.log" Mar 20 11:09:52 crc kubenswrapper[4860]: I0320 11:09:52.317746 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cmc44_a89c8af2-338f-401f-aad5-c6d7763a3b3a/kube-multus/1.log" Mar 20 11:09:52 crc kubenswrapper[4860]: I0320 11:09:52.317827 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cmc44" event={"ID":"a89c8af2-338f-401f-aad5-c6d7763a3b3a","Type":"ContainerStarted","Data":"7c703638b7464205d064c1a5b6a628f2894ada53a1c2e318b74addf7a4cc0084"} Mar 20 11:09:52 crc kubenswrapper[4860]: I0320 11:09:52.321670 4860 generic.go:334] "Generic (PLEG): container finished" podID="fdde4a9b-30f5-42b1-8f84-d913cea8ddc8" containerID="52d88f1b9c02de4f8c5cab45a771fc2befc2a853cd5323af863096c67fea59eb" exitCode=0 Mar 20 11:09:52 crc kubenswrapper[4860]: I0320 11:09:52.321733 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" event={"ID":"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8","Type":"ContainerDied","Data":"52d88f1b9c02de4f8c5cab45a771fc2befc2a853cd5323af863096c67fea59eb"} Mar 20 11:09:52 crc kubenswrapper[4860]: I0320 11:09:52.321769 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" event={"ID":"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8","Type":"ContainerStarted","Data":"02f04e9c289e46cd553ca2efd780e95ad4b4557fff980223eedbc54acb01029d"} Mar 20 11:09:53 crc kubenswrapper[4860]: I0320 
11:09:53.341323 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" event={"ID":"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8","Type":"ContainerStarted","Data":"bc40f58fb5bcf926371f71ee5e80772f04da47e542e7a49fe7413b7e508c8c04"} Mar 20 11:09:53 crc kubenswrapper[4860]: I0320 11:09:53.342025 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" event={"ID":"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8","Type":"ContainerStarted","Data":"a6a6b3d2a971733c4b2f82419ebecf7a2d34500d575921ba4c12307aa3024e00"} Mar 20 11:09:53 crc kubenswrapper[4860]: I0320 11:09:53.342038 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" event={"ID":"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8","Type":"ContainerStarted","Data":"2d5efb607715bd0c42854e8ae2338908f90853fadc4ab20d761bdd4ec974649d"} Mar 20 11:09:53 crc kubenswrapper[4860]: I0320 11:09:53.342047 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" event={"ID":"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8","Type":"ContainerStarted","Data":"075796bbba6e7b4c490fe7e2252b847f258dea25f9078e7e5ce42fa3c844b6a2"} Mar 20 11:09:53 crc kubenswrapper[4860]: I0320 11:09:53.342056 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" event={"ID":"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8","Type":"ContainerStarted","Data":"6f70af02015847e71789804e8346eea16be92700864b6ada1b5c6fbcc8fa9e6a"} Mar 20 11:09:53 crc kubenswrapper[4860]: I0320 11:09:53.342064 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" event={"ID":"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8","Type":"ContainerStarted","Data":"516cb1dd67c70f5a3925ba5b7ff2619cd2629fd2d506cea8c8bc76b04bc2d491"} Mar 20 11:09:55 crc kubenswrapper[4860]: I0320 11:09:55.359533 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" event={"ID":"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8","Type":"ContainerStarted","Data":"0d5c317356089b76000465b1cf9a8c1fd521aaadf094be271294e8d5e4f0ff19"} Mar 20 11:09:58 crc kubenswrapper[4860]: I0320 11:09:58.306406 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-g98c9"] Mar 20 11:09:58 crc kubenswrapper[4860]: I0320 11:09:58.308247 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-g98c9" Mar 20 11:09:58 crc kubenswrapper[4860]: I0320 11:09:58.310804 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Mar 20 11:09:58 crc kubenswrapper[4860]: I0320 11:09:58.311153 4860 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-vn2jz" Mar 20 11:09:58 crc kubenswrapper[4860]: I0320 11:09:58.311322 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Mar 20 11:09:58 crc kubenswrapper[4860]: I0320 11:09:58.312264 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Mar 20 11:09:58 crc kubenswrapper[4860]: I0320 11:09:58.355892 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/09260d7e-28fd-4e8f-be6f-9b7df7c9d345-node-mnt\") pod \"crc-storage-crc-g98c9\" (UID: \"09260d7e-28fd-4e8f-be6f-9b7df7c9d345\") " pod="crc-storage/crc-storage-crc-g98c9" Mar 20 11:09:58 crc kubenswrapper[4860]: I0320 11:09:58.356008 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/09260d7e-28fd-4e8f-be6f-9b7df7c9d345-crc-storage\") pod \"crc-storage-crc-g98c9\" (UID: \"09260d7e-28fd-4e8f-be6f-9b7df7c9d345\") " pod="crc-storage/crc-storage-crc-g98c9" Mar 20 
11:09:58 crc kubenswrapper[4860]: I0320 11:09:58.356079 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qw9d\" (UniqueName: \"kubernetes.io/projected/09260d7e-28fd-4e8f-be6f-9b7df7c9d345-kube-api-access-8qw9d\") pod \"crc-storage-crc-g98c9\" (UID: \"09260d7e-28fd-4e8f-be6f-9b7df7c9d345\") " pod="crc-storage/crc-storage-crc-g98c9" Mar 20 11:09:58 crc kubenswrapper[4860]: I0320 11:09:58.386888 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" event={"ID":"fdde4a9b-30f5-42b1-8f84-d913cea8ddc8","Type":"ContainerStarted","Data":"b0f1e383251f12ec61e4e27889258b8f20deff78aecf28fa42474323abad2c07"} Mar 20 11:09:58 crc kubenswrapper[4860]: I0320 11:09:58.387301 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:58 crc kubenswrapper[4860]: I0320 11:09:58.424206 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:58 crc kubenswrapper[4860]: I0320 11:09:58.427196 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" podStartSLOduration=8.427174249 podStartE2EDuration="8.427174249s" podCreationTimestamp="2026-03-20 11:09:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 11:09:58.423124038 +0000 UTC m=+922.644484956" watchObservedRunningTime="2026-03-20 11:09:58.427174249 +0000 UTC m=+922.648535147" Mar 20 11:09:58 crc kubenswrapper[4860]: I0320 11:09:58.458089 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qw9d\" (UniqueName: \"kubernetes.io/projected/09260d7e-28fd-4e8f-be6f-9b7df7c9d345-kube-api-access-8qw9d\") pod \"crc-storage-crc-g98c9\" (UID: 
\"09260d7e-28fd-4e8f-be6f-9b7df7c9d345\") " pod="crc-storage/crc-storage-crc-g98c9" Mar 20 11:09:58 crc kubenswrapper[4860]: I0320 11:09:58.458267 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/09260d7e-28fd-4e8f-be6f-9b7df7c9d345-node-mnt\") pod \"crc-storage-crc-g98c9\" (UID: \"09260d7e-28fd-4e8f-be6f-9b7df7c9d345\") " pod="crc-storage/crc-storage-crc-g98c9" Mar 20 11:09:58 crc kubenswrapper[4860]: I0320 11:09:58.458369 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/09260d7e-28fd-4e8f-be6f-9b7df7c9d345-crc-storage\") pod \"crc-storage-crc-g98c9\" (UID: \"09260d7e-28fd-4e8f-be6f-9b7df7c9d345\") " pod="crc-storage/crc-storage-crc-g98c9" Mar 20 11:09:58 crc kubenswrapper[4860]: I0320 11:09:58.460087 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/09260d7e-28fd-4e8f-be6f-9b7df7c9d345-node-mnt\") pod \"crc-storage-crc-g98c9\" (UID: \"09260d7e-28fd-4e8f-be6f-9b7df7c9d345\") " pod="crc-storage/crc-storage-crc-g98c9" Mar 20 11:09:58 crc kubenswrapper[4860]: I0320 11:09:58.460757 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/09260d7e-28fd-4e8f-be6f-9b7df7c9d345-crc-storage\") pod \"crc-storage-crc-g98c9\" (UID: \"09260d7e-28fd-4e8f-be6f-9b7df7c9d345\") " pod="crc-storage/crc-storage-crc-g98c9" Mar 20 11:09:58 crc kubenswrapper[4860]: I0320 11:09:58.483713 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qw9d\" (UniqueName: \"kubernetes.io/projected/09260d7e-28fd-4e8f-be6f-9b7df7c9d345-kube-api-access-8qw9d\") pod \"crc-storage-crc-g98c9\" (UID: \"09260d7e-28fd-4e8f-be6f-9b7df7c9d345\") " pod="crc-storage/crc-storage-crc-g98c9" Mar 20 11:09:58 crc kubenswrapper[4860]: I0320 11:09:58.628791 4860 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-g98c9" Mar 20 11:09:58 crc kubenswrapper[4860]: E0320 11:09:58.667020 4860 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-g98c9_crc-storage_09260d7e-28fd-4e8f-be6f-9b7df7c9d345_0(5ce6a54ea35dc7f5bd5fa36a6e600ad1429c6b162ccad3e02e6d432ffc563e9f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 11:09:58 crc kubenswrapper[4860]: E0320 11:09:58.667118 4860 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-g98c9_crc-storage_09260d7e-28fd-4e8f-be6f-9b7df7c9d345_0(5ce6a54ea35dc7f5bd5fa36a6e600ad1429c6b162ccad3e02e6d432ffc563e9f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-g98c9" Mar 20 11:09:58 crc kubenswrapper[4860]: E0320 11:09:58.667144 4860 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-g98c9_crc-storage_09260d7e-28fd-4e8f-be6f-9b7df7c9d345_0(5ce6a54ea35dc7f5bd5fa36a6e600ad1429c6b162ccad3e02e6d432ffc563e9f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="crc-storage/crc-storage-crc-g98c9" Mar 20 11:09:58 crc kubenswrapper[4860]: E0320 11:09:58.667198 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-g98c9_crc-storage(09260d7e-28fd-4e8f-be6f-9b7df7c9d345)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-g98c9_crc-storage(09260d7e-28fd-4e8f-be6f-9b7df7c9d345)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-g98c9_crc-storage_09260d7e-28fd-4e8f-be6f-9b7df7c9d345_0(5ce6a54ea35dc7f5bd5fa36a6e600ad1429c6b162ccad3e02e6d432ffc563e9f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-g98c9" podUID="09260d7e-28fd-4e8f-be6f-9b7df7c9d345" Mar 20 11:09:59 crc kubenswrapper[4860]: I0320 11:09:59.332079 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-g98c9"] Mar 20 11:09:59 crc kubenswrapper[4860]: I0320 11:09:59.392923 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-g98c9" Mar 20 11:09:59 crc kubenswrapper[4860]: I0320 11:09:59.393525 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-g98c9" Mar 20 11:09:59 crc kubenswrapper[4860]: I0320 11:09:59.393583 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:59 crc kubenswrapper[4860]: I0320 11:09:59.393644 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:09:59 crc kubenswrapper[4860]: E0320 11:09:59.434731 4860 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-g98c9_crc-storage_09260d7e-28fd-4e8f-be6f-9b7df7c9d345_0(abc20e4455bcf76ea67fb61ddab0074aed042ee93a8fa06ec9a40b00751c1f7f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 11:09:59 crc kubenswrapper[4860]: E0320 11:09:59.434815 4860 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-g98c9_crc-storage_09260d7e-28fd-4e8f-be6f-9b7df7c9d345_0(abc20e4455bcf76ea67fb61ddab0074aed042ee93a8fa06ec9a40b00751c1f7f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-g98c9" Mar 20 11:09:59 crc kubenswrapper[4860]: E0320 11:09:59.434845 4860 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-g98c9_crc-storage_09260d7e-28fd-4e8f-be6f-9b7df7c9d345_0(abc20e4455bcf76ea67fb61ddab0074aed042ee93a8fa06ec9a40b00751c1f7f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="crc-storage/crc-storage-crc-g98c9" Mar 20 11:09:59 crc kubenswrapper[4860]: E0320 11:09:59.434913 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-g98c9_crc-storage(09260d7e-28fd-4e8f-be6f-9b7df7c9d345)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-g98c9_crc-storage(09260d7e-28fd-4e8f-be6f-9b7df7c9d345)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-g98c9_crc-storage_09260d7e-28fd-4e8f-be6f-9b7df7c9d345_0(abc20e4455bcf76ea67fb61ddab0074aed042ee93a8fa06ec9a40b00751c1f7f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-g98c9" podUID="09260d7e-28fd-4e8f-be6f-9b7df7c9d345" Mar 20 11:09:59 crc kubenswrapper[4860]: I0320 11:09:59.438713 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:10:00 crc kubenswrapper[4860]: I0320 11:10:00.135454 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566750-bgxvq"] Mar 20 11:10:00 crc kubenswrapper[4860]: I0320 11:10:00.136720 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566750-bgxvq" Mar 20 11:10:00 crc kubenswrapper[4860]: I0320 11:10:00.140468 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:10:00 crc kubenswrapper[4860]: I0320 11:10:00.140589 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-82p2f" Mar 20 11:10:00 crc kubenswrapper[4860]: I0320 11:10:00.140832 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:10:00 crc kubenswrapper[4860]: I0320 11:10:00.147071 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566750-bgxvq"] Mar 20 11:10:00 crc kubenswrapper[4860]: I0320 11:10:00.185532 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjsbb\" (UniqueName: \"kubernetes.io/projected/b45dae17-b8e6-4d57-a525-2892e7ff37f7-kube-api-access-pjsbb\") pod \"auto-csr-approver-29566750-bgxvq\" (UID: \"b45dae17-b8e6-4d57-a525-2892e7ff37f7\") " pod="openshift-infra/auto-csr-approver-29566750-bgxvq" Mar 20 11:10:00 crc kubenswrapper[4860]: I0320 11:10:00.287300 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjsbb\" (UniqueName: \"kubernetes.io/projected/b45dae17-b8e6-4d57-a525-2892e7ff37f7-kube-api-access-pjsbb\") pod \"auto-csr-approver-29566750-bgxvq\" (UID: \"b45dae17-b8e6-4d57-a525-2892e7ff37f7\") " pod="openshift-infra/auto-csr-approver-29566750-bgxvq" Mar 20 11:10:00 crc kubenswrapper[4860]: I0320 11:10:00.315814 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjsbb\" (UniqueName: \"kubernetes.io/projected/b45dae17-b8e6-4d57-a525-2892e7ff37f7-kube-api-access-pjsbb\") pod \"auto-csr-approver-29566750-bgxvq\" (UID: \"b45dae17-b8e6-4d57-a525-2892e7ff37f7\") " 
pod="openshift-infra/auto-csr-approver-29566750-bgxvq" Mar 20 11:10:00 crc kubenswrapper[4860]: I0320 11:10:00.455299 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566750-bgxvq" Mar 20 11:10:00 crc kubenswrapper[4860]: E0320 11:10:00.501324 4860 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29566750-bgxvq_openshift-infra_b45dae17-b8e6-4d57-a525-2892e7ff37f7_0(ad242c905983cb71f20cf0e37431ef937d614d9396ea0adf6a6ac026cf6f95b5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 11:10:00 crc kubenswrapper[4860]: E0320 11:10:00.501840 4860 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29566750-bgxvq_openshift-infra_b45dae17-b8e6-4d57-a525-2892e7ff37f7_0(ad242c905983cb71f20cf0e37431ef937d614d9396ea0adf6a6ac026cf6f95b5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29566750-bgxvq" Mar 20 11:10:00 crc kubenswrapper[4860]: E0320 11:10:00.501871 4860 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29566750-bgxvq_openshift-infra_b45dae17-b8e6-4d57-a525-2892e7ff37f7_0(ad242c905983cb71f20cf0e37431ef937d614d9396ea0adf6a6ac026cf6f95b5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-infra/auto-csr-approver-29566750-bgxvq" Mar 20 11:10:00 crc kubenswrapper[4860]: E0320 11:10:00.501946 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"auto-csr-approver-29566750-bgxvq_openshift-infra(b45dae17-b8e6-4d57-a525-2892e7ff37f7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"auto-csr-approver-29566750-bgxvq_openshift-infra(b45dae17-b8e6-4d57-a525-2892e7ff37f7)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29566750-bgxvq_openshift-infra_b45dae17-b8e6-4d57-a525-2892e7ff37f7_0(ad242c905983cb71f20cf0e37431ef937d614d9396ea0adf6a6ac026cf6f95b5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-infra/auto-csr-approver-29566750-bgxvq" podUID="b45dae17-b8e6-4d57-a525-2892e7ff37f7" Mar 20 11:10:01 crc kubenswrapper[4860]: I0320 11:10:01.404491 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566750-bgxvq" Mar 20 11:10:01 crc kubenswrapper[4860]: I0320 11:10:01.405597 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566750-bgxvq" Mar 20 11:10:01 crc kubenswrapper[4860]: E0320 11:10:01.434178 4860 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29566750-bgxvq_openshift-infra_b45dae17-b8e6-4d57-a525-2892e7ff37f7_0(88f90069a7d42f6dc0026f46e01d212a855604ce260bf23f659b61e8f16ca249): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
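[Editor's note] Every sandbox retry above fails for the same root cause: kubelet asks CRI-O to create the pod sandbox before the network provider (ovnkube-node, still starting in these entries) has written a CNI config into /etc/kubernetes/cni/net.d/, so each attempt gets "no CNI configuration file". Once the provider is ready (from 11:10:10 onward below) the same pods start normally. When triaging logs like this, it helps to collapse the noise down to one (pod, error) pair per failure. The following is an illustrative sketch, not part of kubelet: a small parser for the `kuberuntime_sandbox.go` "Failed to create sandbox for pod" entries; the sample `line` is abbreviated from the log above.

```python
import re

# Matches the structured-log form used above:
#   "Failed to create sandbox for pod" err="..." pod="namespace/name"
SANDBOX_ERR = re.compile(
    r'"Failed to create sandbox for pod" err="(?P<err>.*?)" pod="(?P<pod>[^"]+)"'
)

def parse_sandbox_failures(log_text):
    """Return a list of (pod, error) tuples for sandbox-creation failures."""
    return [(m.group("pod"), m.group("err")) for m in SANDBOX_ERR.finditer(log_text)]

# Abbreviated sample entry from the journal above.
line = ('E0320 11:09:58.667118 4860 kuberuntime_sandbox.go:72] '
        '"Failed to create sandbox for pod" err="rpc error: code = Unknown '
        'desc = no CNI configuration file in /etc/kubernetes/cni/net.d/. '
        'Has your network provider started?" pod="crc-storage/crc-storage-crc-g98c9"')

failures = parse_sandbox_failures(line)
# failures[0] -> ("crc-storage/crc-storage-crc-g98c9", "rpc error: ... started?")
```

Running this over the whole journal and counting pods would show the two affected workloads (crc-storage-crc-g98c9 and auto-csr-approver-29566750-bgxvq) failing repeatedly with an identical error, which points at a cluster-wide CNI condition rather than a per-pod problem.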
Mar 20 11:10:01 crc kubenswrapper[4860]: E0320 11:10:01.434279 4860 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29566750-bgxvq_openshift-infra_b45dae17-b8e6-4d57-a525-2892e7ff37f7_0(88f90069a7d42f6dc0026f46e01d212a855604ce260bf23f659b61e8f16ca249): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29566750-bgxvq" Mar 20 11:10:01 crc kubenswrapper[4860]: E0320 11:10:01.434303 4860 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29566750-bgxvq_openshift-infra_b45dae17-b8e6-4d57-a525-2892e7ff37f7_0(88f90069a7d42f6dc0026f46e01d212a855604ce260bf23f659b61e8f16ca249): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29566750-bgxvq" Mar 20 11:10:01 crc kubenswrapper[4860]: E0320 11:10:01.434359 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"auto-csr-approver-29566750-bgxvq_openshift-infra(b45dae17-b8e6-4d57-a525-2892e7ff37f7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"auto-csr-approver-29566750-bgxvq_openshift-infra(b45dae17-b8e6-4d57-a525-2892e7ff37f7)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29566750-bgxvq_openshift-infra_b45dae17-b8e6-4d57-a525-2892e7ff37f7_0(88f90069a7d42f6dc0026f46e01d212a855604ce260bf23f659b61e8f16ca249): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-infra/auto-csr-approver-29566750-bgxvq" podUID="b45dae17-b8e6-4d57-a525-2892e7ff37f7" Mar 20 11:10:10 crc kubenswrapper[4860]: I0320 11:10:10.412656 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-g98c9" Mar 20 11:10:10 crc kubenswrapper[4860]: I0320 11:10:10.415545 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-g98c9" Mar 20 11:10:10 crc kubenswrapper[4860]: I0320 11:10:10.625977 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-g98c9"] Mar 20 11:10:10 crc kubenswrapper[4860]: I0320 11:10:10.635047 4860 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 11:10:11 crc kubenswrapper[4860]: I0320 11:10:11.471893 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-g98c9" event={"ID":"09260d7e-28fd-4e8f-be6f-9b7df7c9d345","Type":"ContainerStarted","Data":"63a4a2b38cf6b489d8cdfb9fb896b9060ca1337daad5de94707bfdf27b00375c"} Mar 20 11:10:12 crc kubenswrapper[4860]: I0320 11:10:12.479878 4860 generic.go:334] "Generic (PLEG): container finished" podID="09260d7e-28fd-4e8f-be6f-9b7df7c9d345" containerID="8bc4373ffa212d4a837568628c26dd14b62ba6066892f6d72ca8bf7d4caad612" exitCode=0 Mar 20 11:10:12 crc kubenswrapper[4860]: I0320 11:10:12.479928 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-g98c9" event={"ID":"09260d7e-28fd-4e8f-be6f-9b7df7c9d345","Type":"ContainerDied","Data":"8bc4373ffa212d4a837568628c26dd14b62ba6066892f6d72ca8bf7d4caad612"} Mar 20 11:10:13 crc kubenswrapper[4860]: I0320 11:10:13.727908 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-g98c9" Mar 20 11:10:13 crc kubenswrapper[4860]: I0320 11:10:13.778092 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/09260d7e-28fd-4e8f-be6f-9b7df7c9d345-crc-storage\") pod \"09260d7e-28fd-4e8f-be6f-9b7df7c9d345\" (UID: \"09260d7e-28fd-4e8f-be6f-9b7df7c9d345\") " Mar 20 11:10:13 crc kubenswrapper[4860]: I0320 11:10:13.778282 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/09260d7e-28fd-4e8f-be6f-9b7df7c9d345-node-mnt\") pod \"09260d7e-28fd-4e8f-be6f-9b7df7c9d345\" (UID: \"09260d7e-28fd-4e8f-be6f-9b7df7c9d345\") " Mar 20 11:10:13 crc kubenswrapper[4860]: I0320 11:10:13.778357 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qw9d\" (UniqueName: \"kubernetes.io/projected/09260d7e-28fd-4e8f-be6f-9b7df7c9d345-kube-api-access-8qw9d\") pod \"09260d7e-28fd-4e8f-be6f-9b7df7c9d345\" (UID: \"09260d7e-28fd-4e8f-be6f-9b7df7c9d345\") " Mar 20 11:10:13 crc kubenswrapper[4860]: I0320 11:10:13.779080 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/09260d7e-28fd-4e8f-be6f-9b7df7c9d345-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "09260d7e-28fd-4e8f-be6f-9b7df7c9d345" (UID: "09260d7e-28fd-4e8f-be6f-9b7df7c9d345"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:10:13 crc kubenswrapper[4860]: I0320 11:10:13.785832 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09260d7e-28fd-4e8f-be6f-9b7df7c9d345-kube-api-access-8qw9d" (OuterVolumeSpecName: "kube-api-access-8qw9d") pod "09260d7e-28fd-4e8f-be6f-9b7df7c9d345" (UID: "09260d7e-28fd-4e8f-be6f-9b7df7c9d345"). InnerVolumeSpecName "kube-api-access-8qw9d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:10:13 crc kubenswrapper[4860]: I0320 11:10:13.795839 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09260d7e-28fd-4e8f-be6f-9b7df7c9d345-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "09260d7e-28fd-4e8f-be6f-9b7df7c9d345" (UID: "09260d7e-28fd-4e8f-be6f-9b7df7c9d345"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:10:13 crc kubenswrapper[4860]: I0320 11:10:13.879782 4860 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/09260d7e-28fd-4e8f-be6f-9b7df7c9d345-node-mnt\") on node \"crc\" DevicePath \"\"" Mar 20 11:10:13 crc kubenswrapper[4860]: I0320 11:10:13.879826 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qw9d\" (UniqueName: \"kubernetes.io/projected/09260d7e-28fd-4e8f-be6f-9b7df7c9d345-kube-api-access-8qw9d\") on node \"crc\" DevicePath \"\"" Mar 20 11:10:13 crc kubenswrapper[4860]: I0320 11:10:13.879839 4860 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/09260d7e-28fd-4e8f-be6f-9b7df7c9d345-crc-storage\") on node \"crc\" DevicePath \"\"" Mar 20 11:10:14 crc kubenswrapper[4860]: I0320 11:10:14.413964 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566750-bgxvq" Mar 20 11:10:14 crc kubenswrapper[4860]: I0320 11:10:14.415489 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566750-bgxvq" Mar 20 11:10:14 crc kubenswrapper[4860]: I0320 11:10:14.494724 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-g98c9" event={"ID":"09260d7e-28fd-4e8f-be6f-9b7df7c9d345","Type":"ContainerDied","Data":"63a4a2b38cf6b489d8cdfb9fb896b9060ca1337daad5de94707bfdf27b00375c"} Mar 20 11:10:14 crc kubenswrapper[4860]: I0320 11:10:14.494779 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63a4a2b38cf6b489d8cdfb9fb896b9060ca1337daad5de94707bfdf27b00375c" Mar 20 11:10:14 crc kubenswrapper[4860]: I0320 11:10:14.494859 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-g98c9" Mar 20 11:10:14 crc kubenswrapper[4860]: I0320 11:10:14.609006 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566750-bgxvq"] Mar 20 11:10:14 crc kubenswrapper[4860]: W0320 11:10:14.617848 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb45dae17_b8e6_4d57_a525_2892e7ff37f7.slice/crio-000d571be4adbaa240194972d81bb16e55a829f676ceb0a0e5868075f3df2036 WatchSource:0}: Error finding container 000d571be4adbaa240194972d81bb16e55a829f676ceb0a0e5868075f3df2036: Status 404 returned error can't find the container with id 000d571be4adbaa240194972d81bb16e55a829f676ceb0a0e5868075f3df2036 Mar 20 11:10:15 crc kubenswrapper[4860]: I0320 11:10:15.502687 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566750-bgxvq" event={"ID":"b45dae17-b8e6-4d57-a525-2892e7ff37f7","Type":"ContainerStarted","Data":"000d571be4adbaa240194972d81bb16e55a829f676ceb0a0e5868075f3df2036"} Mar 20 11:10:16 crc kubenswrapper[4860]: I0320 11:10:16.510367 4860 generic.go:334] "Generic (PLEG): container finished" podID="b45dae17-b8e6-4d57-a525-2892e7ff37f7" 
containerID="bb236d0c90b35c798ab0b91ca64ed98eb462e09d8cbe538c6779b53064938615" exitCode=0 Mar 20 11:10:16 crc kubenswrapper[4860]: I0320 11:10:16.510415 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566750-bgxvq" event={"ID":"b45dae17-b8e6-4d57-a525-2892e7ff37f7","Type":"ContainerDied","Data":"bb236d0c90b35c798ab0b91ca64ed98eb462e09d8cbe538c6779b53064938615"} Mar 20 11:10:17 crc kubenswrapper[4860]: I0320 11:10:17.751572 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566750-bgxvq" Mar 20 11:10:17 crc kubenswrapper[4860]: I0320 11:10:17.832751 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjsbb\" (UniqueName: \"kubernetes.io/projected/b45dae17-b8e6-4d57-a525-2892e7ff37f7-kube-api-access-pjsbb\") pod \"b45dae17-b8e6-4d57-a525-2892e7ff37f7\" (UID: \"b45dae17-b8e6-4d57-a525-2892e7ff37f7\") " Mar 20 11:10:17 crc kubenswrapper[4860]: I0320 11:10:17.841112 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b45dae17-b8e6-4d57-a525-2892e7ff37f7-kube-api-access-pjsbb" (OuterVolumeSpecName: "kube-api-access-pjsbb") pod "b45dae17-b8e6-4d57-a525-2892e7ff37f7" (UID: "b45dae17-b8e6-4d57-a525-2892e7ff37f7"). InnerVolumeSpecName "kube-api-access-pjsbb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:10:17 crc kubenswrapper[4860]: I0320 11:10:17.935117 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjsbb\" (UniqueName: \"kubernetes.io/projected/b45dae17-b8e6-4d57-a525-2892e7ff37f7-kube-api-access-pjsbb\") on node \"crc\" DevicePath \"\"" Mar 20 11:10:18 crc kubenswrapper[4860]: I0320 11:10:18.525585 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566750-bgxvq" event={"ID":"b45dae17-b8e6-4d57-a525-2892e7ff37f7","Type":"ContainerDied","Data":"000d571be4adbaa240194972d81bb16e55a829f676ceb0a0e5868075f3df2036"} Mar 20 11:10:18 crc kubenswrapper[4860]: I0320 11:10:18.525630 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="000d571be4adbaa240194972d81bb16e55a829f676ceb0a0e5868075f3df2036" Mar 20 11:10:18 crc kubenswrapper[4860]: I0320 11:10:18.525655 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566750-bgxvq" Mar 20 11:10:18 crc kubenswrapper[4860]: I0320 11:10:18.826824 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566744-9jlnw"] Mar 20 11:10:18 crc kubenswrapper[4860]: I0320 11:10:18.832819 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566744-9jlnw"] Mar 20 11:10:19 crc kubenswrapper[4860]: I0320 11:10:19.421423 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d72f65a1-efc6-45f5-a056-d64ee5bce755" path="/var/lib/kubelet/pods/d72f65a1-efc6-45f5-a056-d64ee5bce755/volumes" Mar 20 11:10:20 crc kubenswrapper[4860]: I0320 11:10:20.618104 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746cdjd"] Mar 20 11:10:20 crc kubenswrapper[4860]: E0320 11:10:20.618952 4860 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="09260d7e-28fd-4e8f-be6f-9b7df7c9d345" containerName="storage" Mar 20 11:10:20 crc kubenswrapper[4860]: I0320 11:10:20.618972 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="09260d7e-28fd-4e8f-be6f-9b7df7c9d345" containerName="storage" Mar 20 11:10:20 crc kubenswrapper[4860]: E0320 11:10:20.618993 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b45dae17-b8e6-4d57-a525-2892e7ff37f7" containerName="oc" Mar 20 11:10:20 crc kubenswrapper[4860]: I0320 11:10:20.619001 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="b45dae17-b8e6-4d57-a525-2892e7ff37f7" containerName="oc" Mar 20 11:10:20 crc kubenswrapper[4860]: I0320 11:10:20.619121 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="09260d7e-28fd-4e8f-be6f-9b7df7c9d345" containerName="storage" Mar 20 11:10:20 crc kubenswrapper[4860]: I0320 11:10:20.619138 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="b45dae17-b8e6-4d57-a525-2892e7ff37f7" containerName="oc" Mar 20 11:10:20 crc kubenswrapper[4860]: I0320 11:10:20.620093 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746cdjd" Mar 20 11:10:20 crc kubenswrapper[4860]: I0320 11:10:20.622687 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 20 11:10:20 crc kubenswrapper[4860]: I0320 11:10:20.629830 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746cdjd"] Mar 20 11:10:20 crc kubenswrapper[4860]: I0320 11:10:20.673264 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eb4de49a-fca6-4c8c-8484-461859f95884-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746cdjd\" (UID: \"eb4de49a-fca6-4c8c-8484-461859f95884\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746cdjd" Mar 20 11:10:20 crc kubenswrapper[4860]: I0320 11:10:20.673321 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eb4de49a-fca6-4c8c-8484-461859f95884-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746cdjd\" (UID: \"eb4de49a-fca6-4c8c-8484-461859f95884\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746cdjd" Mar 20 11:10:20 crc kubenswrapper[4860]: I0320 11:10:20.673435 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhxht\" (UniqueName: \"kubernetes.io/projected/eb4de49a-fca6-4c8c-8484-461859f95884-kube-api-access-xhxht\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746cdjd\" (UID: \"eb4de49a-fca6-4c8c-8484-461859f95884\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746cdjd" Mar 20 11:10:20 crc kubenswrapper[4860]: 
I0320 11:10:20.775176 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhxht\" (UniqueName: \"kubernetes.io/projected/eb4de49a-fca6-4c8c-8484-461859f95884-kube-api-access-xhxht\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746cdjd\" (UID: \"eb4de49a-fca6-4c8c-8484-461859f95884\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746cdjd" Mar 20 11:10:20 crc kubenswrapper[4860]: I0320 11:10:20.775271 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eb4de49a-fca6-4c8c-8484-461859f95884-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746cdjd\" (UID: \"eb4de49a-fca6-4c8c-8484-461859f95884\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746cdjd" Mar 20 11:10:20 crc kubenswrapper[4860]: I0320 11:10:20.775312 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eb4de49a-fca6-4c8c-8484-461859f95884-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746cdjd\" (UID: \"eb4de49a-fca6-4c8c-8484-461859f95884\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746cdjd" Mar 20 11:10:20 crc kubenswrapper[4860]: I0320 11:10:20.775931 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eb4de49a-fca6-4c8c-8484-461859f95884-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746cdjd\" (UID: \"eb4de49a-fca6-4c8c-8484-461859f95884\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746cdjd" Mar 20 11:10:20 crc kubenswrapper[4860]: I0320 11:10:20.776130 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/eb4de49a-fca6-4c8c-8484-461859f95884-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746cdjd\" (UID: \"eb4de49a-fca6-4c8c-8484-461859f95884\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746cdjd" Mar 20 11:10:20 crc kubenswrapper[4860]: I0320 11:10:20.794289 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhxht\" (UniqueName: \"kubernetes.io/projected/eb4de49a-fca6-4c8c-8484-461859f95884-kube-api-access-xhxht\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746cdjd\" (UID: \"eb4de49a-fca6-4c8c-8484-461859f95884\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746cdjd" Mar 20 11:10:20 crc kubenswrapper[4860]: I0320 11:10:20.944297 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746cdjd" Mar 20 11:10:21 crc kubenswrapper[4860]: I0320 11:10:21.157672 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746cdjd"] Mar 20 11:10:21 crc kubenswrapper[4860]: W0320 11:10:21.164473 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb4de49a_fca6_4c8c_8484_461859f95884.slice/crio-ec54a7908fbfc0ba4a1a04b89bb324101104b953442cebe94810ba54c04ca6dd WatchSource:0}: Error finding container ec54a7908fbfc0ba4a1a04b89bb324101104b953442cebe94810ba54c04ca6dd: Status 404 returned error can't find the container with id ec54a7908fbfc0ba4a1a04b89bb324101104b953442cebe94810ba54c04ca6dd Mar 20 11:10:21 crc kubenswrapper[4860]: I0320 11:10:21.374366 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dbgf2" Mar 20 11:10:21 crc kubenswrapper[4860]: I0320 11:10:21.545586 4860 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746cdjd" event={"ID":"eb4de49a-fca6-4c8c-8484-461859f95884","Type":"ContainerStarted","Data":"9e545cf0d8db4a6df524f343b19b1c45cdf0a040ca55b4d3380d49fe8f3dd5fd"} Mar 20 11:10:21 crc kubenswrapper[4860]: I0320 11:10:21.546503 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746cdjd" event={"ID":"eb4de49a-fca6-4c8c-8484-461859f95884","Type":"ContainerStarted","Data":"ec54a7908fbfc0ba4a1a04b89bb324101104b953442cebe94810ba54c04ca6dd"} Mar 20 11:10:22 crc kubenswrapper[4860]: I0320 11:10:22.552954 4860 generic.go:334] "Generic (PLEG): container finished" podID="eb4de49a-fca6-4c8c-8484-461859f95884" containerID="9e545cf0d8db4a6df524f343b19b1c45cdf0a040ca55b4d3380d49fe8f3dd5fd" exitCode=0 Mar 20 11:10:22 crc kubenswrapper[4860]: I0320 11:10:22.553018 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746cdjd" event={"ID":"eb4de49a-fca6-4c8c-8484-461859f95884","Type":"ContainerDied","Data":"9e545cf0d8db4a6df524f343b19b1c45cdf0a040ca55b4d3380d49fe8f3dd5fd"} Mar 20 11:10:22 crc kubenswrapper[4860]: I0320 11:10:22.735093 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-b9j9j"] Mar 20 11:10:22 crc kubenswrapper[4860]: I0320 11:10:22.736978 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-b9j9j" Mar 20 11:10:22 crc kubenswrapper[4860]: I0320 11:10:22.751997 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b9j9j"] Mar 20 11:10:22 crc kubenswrapper[4860]: I0320 11:10:22.805557 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7tgs\" (UniqueName: \"kubernetes.io/projected/be396660-5e1e-4dfe-9a08-26b2fcc69a0a-kube-api-access-q7tgs\") pod \"redhat-operators-b9j9j\" (UID: \"be396660-5e1e-4dfe-9a08-26b2fcc69a0a\") " pod="openshift-marketplace/redhat-operators-b9j9j" Mar 20 11:10:22 crc kubenswrapper[4860]: I0320 11:10:22.805638 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be396660-5e1e-4dfe-9a08-26b2fcc69a0a-utilities\") pod \"redhat-operators-b9j9j\" (UID: \"be396660-5e1e-4dfe-9a08-26b2fcc69a0a\") " pod="openshift-marketplace/redhat-operators-b9j9j" Mar 20 11:10:22 crc kubenswrapper[4860]: I0320 11:10:22.805668 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be396660-5e1e-4dfe-9a08-26b2fcc69a0a-catalog-content\") pod \"redhat-operators-b9j9j\" (UID: \"be396660-5e1e-4dfe-9a08-26b2fcc69a0a\") " pod="openshift-marketplace/redhat-operators-b9j9j" Mar 20 11:10:22 crc kubenswrapper[4860]: I0320 11:10:22.906579 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7tgs\" (UniqueName: \"kubernetes.io/projected/be396660-5e1e-4dfe-9a08-26b2fcc69a0a-kube-api-access-q7tgs\") pod \"redhat-operators-b9j9j\" (UID: \"be396660-5e1e-4dfe-9a08-26b2fcc69a0a\") " pod="openshift-marketplace/redhat-operators-b9j9j" Mar 20 11:10:22 crc kubenswrapper[4860]: I0320 11:10:22.906648 4860 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be396660-5e1e-4dfe-9a08-26b2fcc69a0a-utilities\") pod \"redhat-operators-b9j9j\" (UID: \"be396660-5e1e-4dfe-9a08-26b2fcc69a0a\") " pod="openshift-marketplace/redhat-operators-b9j9j" Mar 20 11:10:22 crc kubenswrapper[4860]: I0320 11:10:22.906670 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be396660-5e1e-4dfe-9a08-26b2fcc69a0a-catalog-content\") pod \"redhat-operators-b9j9j\" (UID: \"be396660-5e1e-4dfe-9a08-26b2fcc69a0a\") " pod="openshift-marketplace/redhat-operators-b9j9j" Mar 20 11:10:22 crc kubenswrapper[4860]: I0320 11:10:22.907637 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be396660-5e1e-4dfe-9a08-26b2fcc69a0a-catalog-content\") pod \"redhat-operators-b9j9j\" (UID: \"be396660-5e1e-4dfe-9a08-26b2fcc69a0a\") " pod="openshift-marketplace/redhat-operators-b9j9j" Mar 20 11:10:22 crc kubenswrapper[4860]: I0320 11:10:22.907990 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be396660-5e1e-4dfe-9a08-26b2fcc69a0a-utilities\") pod \"redhat-operators-b9j9j\" (UID: \"be396660-5e1e-4dfe-9a08-26b2fcc69a0a\") " pod="openshift-marketplace/redhat-operators-b9j9j" Mar 20 11:10:22 crc kubenswrapper[4860]: I0320 11:10:22.940315 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7tgs\" (UniqueName: \"kubernetes.io/projected/be396660-5e1e-4dfe-9a08-26b2fcc69a0a-kube-api-access-q7tgs\") pod \"redhat-operators-b9j9j\" (UID: \"be396660-5e1e-4dfe-9a08-26b2fcc69a0a\") " pod="openshift-marketplace/redhat-operators-b9j9j" Mar 20 11:10:23 crc kubenswrapper[4860]: I0320 11:10:23.056266 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-b9j9j" Mar 20 11:10:23 crc kubenswrapper[4860]: I0320 11:10:23.287651 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b9j9j"] Mar 20 11:10:23 crc kubenswrapper[4860]: I0320 11:10:23.561137 4860 generic.go:334] "Generic (PLEG): container finished" podID="be396660-5e1e-4dfe-9a08-26b2fcc69a0a" containerID="6175a4aeb09e5e4ddb845b3a23c481aab432f44fb75f04e279bd3991e69ab364" exitCode=0 Mar 20 11:10:23 crc kubenswrapper[4860]: I0320 11:10:23.561197 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b9j9j" event={"ID":"be396660-5e1e-4dfe-9a08-26b2fcc69a0a","Type":"ContainerDied","Data":"6175a4aeb09e5e4ddb845b3a23c481aab432f44fb75f04e279bd3991e69ab364"} Mar 20 11:10:23 crc kubenswrapper[4860]: I0320 11:10:23.561684 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b9j9j" event={"ID":"be396660-5e1e-4dfe-9a08-26b2fcc69a0a","Type":"ContainerStarted","Data":"7ba86a3ba39138c4bcf143a2055d72aa4da66a6fe5dfcdc6f44ec0ae82cefec5"} Mar 20 11:10:24 crc kubenswrapper[4860]: I0320 11:10:24.577216 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b9j9j" event={"ID":"be396660-5e1e-4dfe-9a08-26b2fcc69a0a","Type":"ContainerStarted","Data":"e64b2de3852cba02ad07be7827131798bd5071b683d1f700c0238cb432da1cfd"} Mar 20 11:10:24 crc kubenswrapper[4860]: I0320 11:10:24.580027 4860 generic.go:334] "Generic (PLEG): container finished" podID="eb4de49a-fca6-4c8c-8484-461859f95884" containerID="2733325d085ecbc581f2fe5823956c910410e5a3d79cfb6bd9c2dc10591bf08e" exitCode=0 Mar 20 11:10:24 crc kubenswrapper[4860]: I0320 11:10:24.580079 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746cdjd" 
event={"ID":"eb4de49a-fca6-4c8c-8484-461859f95884","Type":"ContainerDied","Data":"2733325d085ecbc581f2fe5823956c910410e5a3d79cfb6bd9c2dc10591bf08e"} Mar 20 11:10:25 crc kubenswrapper[4860]: I0320 11:10:25.588582 4860 generic.go:334] "Generic (PLEG): container finished" podID="eb4de49a-fca6-4c8c-8484-461859f95884" containerID="c2c4cd967c16c24218315f5eedf406be6b1dec547a49ac6f6d3c445bafe22901" exitCode=0 Mar 20 11:10:25 crc kubenswrapper[4860]: I0320 11:10:25.588803 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746cdjd" event={"ID":"eb4de49a-fca6-4c8c-8484-461859f95884","Type":"ContainerDied","Data":"c2c4cd967c16c24218315f5eedf406be6b1dec547a49ac6f6d3c445bafe22901"} Mar 20 11:10:26 crc kubenswrapper[4860]: I0320 11:10:26.598525 4860 generic.go:334] "Generic (PLEG): container finished" podID="be396660-5e1e-4dfe-9a08-26b2fcc69a0a" containerID="e64b2de3852cba02ad07be7827131798bd5071b683d1f700c0238cb432da1cfd" exitCode=0 Mar 20 11:10:26 crc kubenswrapper[4860]: I0320 11:10:26.598597 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b9j9j" event={"ID":"be396660-5e1e-4dfe-9a08-26b2fcc69a0a","Type":"ContainerDied","Data":"e64b2de3852cba02ad07be7827131798bd5071b683d1f700c0238cb432da1cfd"} Mar 20 11:10:26 crc kubenswrapper[4860]: I0320 11:10:26.946697 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746cdjd" Mar 20 11:10:27 crc kubenswrapper[4860]: I0320 11:10:27.064660 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eb4de49a-fca6-4c8c-8484-461859f95884-util\") pod \"eb4de49a-fca6-4c8c-8484-461859f95884\" (UID: \"eb4de49a-fca6-4c8c-8484-461859f95884\") " Mar 20 11:10:27 crc kubenswrapper[4860]: I0320 11:10:27.064841 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eb4de49a-fca6-4c8c-8484-461859f95884-bundle\") pod \"eb4de49a-fca6-4c8c-8484-461859f95884\" (UID: \"eb4de49a-fca6-4c8c-8484-461859f95884\") " Mar 20 11:10:27 crc kubenswrapper[4860]: I0320 11:10:27.064900 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhxht\" (UniqueName: \"kubernetes.io/projected/eb4de49a-fca6-4c8c-8484-461859f95884-kube-api-access-xhxht\") pod \"eb4de49a-fca6-4c8c-8484-461859f95884\" (UID: \"eb4de49a-fca6-4c8c-8484-461859f95884\") " Mar 20 11:10:27 crc kubenswrapper[4860]: I0320 11:10:27.065663 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb4de49a-fca6-4c8c-8484-461859f95884-bundle" (OuterVolumeSpecName: "bundle") pod "eb4de49a-fca6-4c8c-8484-461859f95884" (UID: "eb4de49a-fca6-4c8c-8484-461859f95884"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:10:27 crc kubenswrapper[4860]: I0320 11:10:27.072349 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb4de49a-fca6-4c8c-8484-461859f95884-kube-api-access-xhxht" (OuterVolumeSpecName: "kube-api-access-xhxht") pod "eb4de49a-fca6-4c8c-8484-461859f95884" (UID: "eb4de49a-fca6-4c8c-8484-461859f95884"). InnerVolumeSpecName "kube-api-access-xhxht". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:10:27 crc kubenswrapper[4860]: I0320 11:10:27.075605 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb4de49a-fca6-4c8c-8484-461859f95884-util" (OuterVolumeSpecName: "util") pod "eb4de49a-fca6-4c8c-8484-461859f95884" (UID: "eb4de49a-fca6-4c8c-8484-461859f95884"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:10:27 crc kubenswrapper[4860]: I0320 11:10:27.165931 4860 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eb4de49a-fca6-4c8c-8484-461859f95884-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 11:10:27 crc kubenswrapper[4860]: I0320 11:10:27.165973 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xhxht\" (UniqueName: \"kubernetes.io/projected/eb4de49a-fca6-4c8c-8484-461859f95884-kube-api-access-xhxht\") on node \"crc\" DevicePath \"\"" Mar 20 11:10:27 crc kubenswrapper[4860]: I0320 11:10:27.165993 4860 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eb4de49a-fca6-4c8c-8484-461859f95884-util\") on node \"crc\" DevicePath \"\"" Mar 20 11:10:27 crc kubenswrapper[4860]: I0320 11:10:27.607326 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746cdjd" event={"ID":"eb4de49a-fca6-4c8c-8484-461859f95884","Type":"ContainerDied","Data":"ec54a7908fbfc0ba4a1a04b89bb324101104b953442cebe94810ba54c04ca6dd"} Mar 20 11:10:27 crc kubenswrapper[4860]: I0320 11:10:27.607721 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec54a7908fbfc0ba4a1a04b89bb324101104b953442cebe94810ba54c04ca6dd" Mar 20 11:10:27 crc kubenswrapper[4860]: I0320 11:10:27.607341 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746cdjd" Mar 20 11:10:27 crc kubenswrapper[4860]: I0320 11:10:27.611708 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b9j9j" event={"ID":"be396660-5e1e-4dfe-9a08-26b2fcc69a0a","Type":"ContainerStarted","Data":"cd3a05044bfe778d42dcd060ef5d691bbbe313f7cf0b5e7dfb5bc41edbe2c760"} Mar 20 11:10:27 crc kubenswrapper[4860]: I0320 11:10:27.630213 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-b9j9j" podStartSLOduration=2.115647349 podStartE2EDuration="5.630182772s" podCreationTimestamp="2026-03-20 11:10:22 +0000 UTC" firstStartedPulling="2026-03-20 11:10:23.562805017 +0000 UTC m=+947.784165915" lastFinishedPulling="2026-03-20 11:10:27.07734044 +0000 UTC m=+951.298701338" observedRunningTime="2026-03-20 11:10:27.629620517 +0000 UTC m=+951.850981415" watchObservedRunningTime="2026-03-20 11:10:27.630182772 +0000 UTC m=+951.851543690" Mar 20 11:10:31 crc kubenswrapper[4860]: I0320 11:10:31.046108 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-dmczs"] Mar 20 11:10:31 crc kubenswrapper[4860]: E0320 11:10:31.046917 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb4de49a-fca6-4c8c-8484-461859f95884" containerName="util" Mar 20 11:10:31 crc kubenswrapper[4860]: I0320 11:10:31.046937 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb4de49a-fca6-4c8c-8484-461859f95884" containerName="util" Mar 20 11:10:31 crc kubenswrapper[4860]: E0320 11:10:31.046959 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb4de49a-fca6-4c8c-8484-461859f95884" containerName="pull" Mar 20 11:10:31 crc kubenswrapper[4860]: I0320 11:10:31.046969 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb4de49a-fca6-4c8c-8484-461859f95884" containerName="pull" Mar 20 
11:10:31 crc kubenswrapper[4860]: E0320 11:10:31.046982 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb4de49a-fca6-4c8c-8484-461859f95884" containerName="extract" Mar 20 11:10:31 crc kubenswrapper[4860]: I0320 11:10:31.046993 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb4de49a-fca6-4c8c-8484-461859f95884" containerName="extract" Mar 20 11:10:31 crc kubenswrapper[4860]: I0320 11:10:31.047135 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb4de49a-fca6-4c8c-8484-461859f95884" containerName="extract" Mar 20 11:10:31 crc kubenswrapper[4860]: I0320 11:10:31.047715 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-dmczs" Mar 20 11:10:31 crc kubenswrapper[4860]: I0320 11:10:31.050512 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 20 11:10:31 crc kubenswrapper[4860]: I0320 11:10:31.050964 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-c67w4" Mar 20 11:10:31 crc kubenswrapper[4860]: I0320 11:10:31.051283 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 20 11:10:31 crc kubenswrapper[4860]: I0320 11:10:31.068530 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-dmczs"] Mar 20 11:10:31 crc kubenswrapper[4860]: I0320 11:10:31.122511 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rvjp\" (UniqueName: \"kubernetes.io/projected/ce7d9f29-28cd-4038-b492-b18e0b129907-kube-api-access-4rvjp\") pod \"nmstate-operator-796d4cfff4-dmczs\" (UID: \"ce7d9f29-28cd-4038-b492-b18e0b129907\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-dmczs" Mar 20 11:10:31 crc kubenswrapper[4860]: I0320 11:10:31.223653 4860 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rvjp\" (UniqueName: \"kubernetes.io/projected/ce7d9f29-28cd-4038-b492-b18e0b129907-kube-api-access-4rvjp\") pod \"nmstate-operator-796d4cfff4-dmczs\" (UID: \"ce7d9f29-28cd-4038-b492-b18e0b129907\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-dmczs" Mar 20 11:10:31 crc kubenswrapper[4860]: I0320 11:10:31.244626 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rvjp\" (UniqueName: \"kubernetes.io/projected/ce7d9f29-28cd-4038-b492-b18e0b129907-kube-api-access-4rvjp\") pod \"nmstate-operator-796d4cfff4-dmczs\" (UID: \"ce7d9f29-28cd-4038-b492-b18e0b129907\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-dmczs" Mar 20 11:10:31 crc kubenswrapper[4860]: I0320 11:10:31.367332 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-dmczs" Mar 20 11:10:31 crc kubenswrapper[4860]: I0320 11:10:31.616549 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-dmczs"] Mar 20 11:10:31 crc kubenswrapper[4860]: I0320 11:10:31.651332 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-dmczs" event={"ID":"ce7d9f29-28cd-4038-b492-b18e0b129907","Type":"ContainerStarted","Data":"3b92c0467e344677f1da4a7e33230a9b30dc7ba817ecf222a202b4656bb4d20f"} Mar 20 11:10:33 crc kubenswrapper[4860]: I0320 11:10:33.057128 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-b9j9j" Mar 20 11:10:33 crc kubenswrapper[4860]: I0320 11:10:33.057923 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-b9j9j" Mar 20 11:10:34 crc kubenswrapper[4860]: I0320 11:10:34.101939 4860 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-b9j9j" podUID="be396660-5e1e-4dfe-9a08-26b2fcc69a0a" containerName="registry-server" probeResult="failure" output=< Mar 20 11:10:34 crc kubenswrapper[4860]: timeout: failed to connect service ":50051" within 1s Mar 20 11:10:34 crc kubenswrapper[4860]: > Mar 20 11:10:34 crc kubenswrapper[4860]: I0320 11:10:34.673563 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-dmczs" event={"ID":"ce7d9f29-28cd-4038-b492-b18e0b129907","Type":"ContainerStarted","Data":"085d319dc17f08845c93b80c65b00d0c34e8491a854672021fff83f2f6fa35f8"} Mar 20 11:10:34 crc kubenswrapper[4860]: I0320 11:10:34.692430 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-796d4cfff4-dmczs" podStartSLOduration=0.91749681 podStartE2EDuration="3.692411405s" podCreationTimestamp="2026-03-20 11:10:31 +0000 UTC" firstStartedPulling="2026-03-20 11:10:31.621325994 +0000 UTC m=+955.842686892" lastFinishedPulling="2026-03-20 11:10:34.396240589 +0000 UTC m=+958.617601487" observedRunningTime="2026-03-20 11:10:34.691504861 +0000 UTC m=+958.912865759" watchObservedRunningTime="2026-03-20 11:10:34.692411405 +0000 UTC m=+958.913772303" Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.622839 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-wr9vc"] Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.625073 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-wr9vc" Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.627586 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-rwxzd" Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.640838 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-wr9vc"] Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.647468 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-9cfpv"] Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.648546 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-9cfpv" Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.661803 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2qvg\" (UniqueName: \"kubernetes.io/projected/6f56c0b5-3d27-49e6-af5b-6ad929d9e857-kube-api-access-k2qvg\") pod \"nmstate-metrics-9b8c8685d-wr9vc\" (UID: \"6f56c0b5-3d27-49e6-af5b-6ad929d9e857\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-wr9vc" Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.663517 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.680597 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-9cfpv"] Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.690452 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-mdh82"] Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.691676 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-mdh82" Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.763240 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/ef7f3b63-3a7d-483b-95c1-32961dad6226-ovs-socket\") pod \"nmstate-handler-mdh82\" (UID: \"ef7f3b63-3a7d-483b-95c1-32961dad6226\") " pod="openshift-nmstate/nmstate-handler-mdh82" Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.763627 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/ef7f3b63-3a7d-483b-95c1-32961dad6226-nmstate-lock\") pod \"nmstate-handler-mdh82\" (UID: \"ef7f3b63-3a7d-483b-95c1-32961dad6226\") " pod="openshift-nmstate/nmstate-handler-mdh82" Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.763730 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/db5d41a4-2808-4189-8c3e-e0730cdf1a4f-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-9cfpv\" (UID: \"db5d41a4-2808-4189-8c3e-e0730cdf1a4f\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-9cfpv" Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.763816 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhgpp\" (UniqueName: \"kubernetes.io/projected/ef7f3b63-3a7d-483b-95c1-32961dad6226-kube-api-access-xhgpp\") pod \"nmstate-handler-mdh82\" (UID: \"ef7f3b63-3a7d-483b-95c1-32961dad6226\") " pod="openshift-nmstate/nmstate-handler-mdh82" Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.763940 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2qvg\" (UniqueName: \"kubernetes.io/projected/6f56c0b5-3d27-49e6-af5b-6ad929d9e857-kube-api-access-k2qvg\") pod 
\"nmstate-metrics-9b8c8685d-wr9vc\" (UID: \"6f56c0b5-3d27-49e6-af5b-6ad929d9e857\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-wr9vc" Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.764040 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/ef7f3b63-3a7d-483b-95c1-32961dad6226-dbus-socket\") pod \"nmstate-handler-mdh82\" (UID: \"ef7f3b63-3a7d-483b-95c1-32961dad6226\") " pod="openshift-nmstate/nmstate-handler-mdh82" Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.764140 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2wd6\" (UniqueName: \"kubernetes.io/projected/db5d41a4-2808-4189-8c3e-e0730cdf1a4f-kube-api-access-s2wd6\") pod \"nmstate-webhook-5f558f5558-9cfpv\" (UID: \"db5d41a4-2808-4189-8c3e-e0730cdf1a4f\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-9cfpv" Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.792493 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2qvg\" (UniqueName: \"kubernetes.io/projected/6f56c0b5-3d27-49e6-af5b-6ad929d9e857-kube-api-access-k2qvg\") pod \"nmstate-metrics-9b8c8685d-wr9vc\" (UID: \"6f56c0b5-3d27-49e6-af5b-6ad929d9e857\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-wr9vc" Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.813382 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-l98lx"] Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.814243 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-l98lx" Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.816437 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-8qkr7" Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.816941 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.823648 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.828783 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-l98lx"] Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.866912 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/ef7f3b63-3a7d-483b-95c1-32961dad6226-nmstate-lock\") pod \"nmstate-handler-mdh82\" (UID: \"ef7f3b63-3a7d-483b-95c1-32961dad6226\") " pod="openshift-nmstate/nmstate-handler-mdh82" Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.867026 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/db5d41a4-2808-4189-8c3e-e0730cdf1a4f-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-9cfpv\" (UID: \"db5d41a4-2808-4189-8c3e-e0730cdf1a4f\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-9cfpv" Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.867069 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhgpp\" (UniqueName: \"kubernetes.io/projected/ef7f3b63-3a7d-483b-95c1-32961dad6226-kube-api-access-xhgpp\") pod \"nmstate-handler-mdh82\" (UID: \"ef7f3b63-3a7d-483b-95c1-32961dad6226\") " pod="openshift-nmstate/nmstate-handler-mdh82" Mar 20 11:10:40 crc kubenswrapper[4860]: 
I0320 11:10:40.867142 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2k6r2\" (UniqueName: \"kubernetes.io/projected/a91c6f2b-7646-4f4d-bdc2-47304e36da4e-kube-api-access-2k6r2\") pod \"nmstate-console-plugin-86f58fcf4-l98lx\" (UID: \"a91c6f2b-7646-4f4d-bdc2-47304e36da4e\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-l98lx" Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.867174 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/ef7f3b63-3a7d-483b-95c1-32961dad6226-dbus-socket\") pod \"nmstate-handler-mdh82\" (UID: \"ef7f3b63-3a7d-483b-95c1-32961dad6226\") " pod="openshift-nmstate/nmstate-handler-mdh82" Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.867239 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2wd6\" (UniqueName: \"kubernetes.io/projected/db5d41a4-2808-4189-8c3e-e0730cdf1a4f-kube-api-access-s2wd6\") pod \"nmstate-webhook-5f558f5558-9cfpv\" (UID: \"db5d41a4-2808-4189-8c3e-e0730cdf1a4f\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-9cfpv" Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.867733 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/ef7f3b63-3a7d-483b-95c1-32961dad6226-nmstate-lock\") pod \"nmstate-handler-mdh82\" (UID: \"ef7f3b63-3a7d-483b-95c1-32961dad6226\") " pod="openshift-nmstate/nmstate-handler-mdh82" Mar 20 11:10:40 crc kubenswrapper[4860]: E0320 11:10:40.867789 4860 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.867857 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/a91c6f2b-7646-4f4d-bdc2-47304e36da4e-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-l98lx\" (UID: \"a91c6f2b-7646-4f4d-bdc2-47304e36da4e\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-l98lx" Mar 20 11:10:40 crc kubenswrapper[4860]: E0320 11:10:40.867892 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db5d41a4-2808-4189-8c3e-e0730cdf1a4f-tls-key-pair podName:db5d41a4-2808-4189-8c3e-e0730cdf1a4f nodeName:}" failed. No retries permitted until 2026-03-20 11:10:41.367870218 +0000 UTC m=+965.589231116 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/db5d41a4-2808-4189-8c3e-e0730cdf1a4f-tls-key-pair") pod "nmstate-webhook-5f558f5558-9cfpv" (UID: "db5d41a4-2808-4189-8c3e-e0730cdf1a4f") : secret "openshift-nmstate-webhook" not found Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.867915 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a91c6f2b-7646-4f4d-bdc2-47304e36da4e-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-l98lx\" (UID: \"a91c6f2b-7646-4f4d-bdc2-47304e36da4e\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-l98lx" Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.867960 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/ef7f3b63-3a7d-483b-95c1-32961dad6226-ovs-socket\") pod \"nmstate-handler-mdh82\" (UID: \"ef7f3b63-3a7d-483b-95c1-32961dad6226\") " pod="openshift-nmstate/nmstate-handler-mdh82" Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.868139 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/ef7f3b63-3a7d-483b-95c1-32961dad6226-ovs-socket\") pod \"nmstate-handler-mdh82\" (UID: 
\"ef7f3b63-3a7d-483b-95c1-32961dad6226\") " pod="openshift-nmstate/nmstate-handler-mdh82" Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.868540 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/ef7f3b63-3a7d-483b-95c1-32961dad6226-dbus-socket\") pod \"nmstate-handler-mdh82\" (UID: \"ef7f3b63-3a7d-483b-95c1-32961dad6226\") " pod="openshift-nmstate/nmstate-handler-mdh82" Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.887024 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhgpp\" (UniqueName: \"kubernetes.io/projected/ef7f3b63-3a7d-483b-95c1-32961dad6226-kube-api-access-xhgpp\") pod \"nmstate-handler-mdh82\" (UID: \"ef7f3b63-3a7d-483b-95c1-32961dad6226\") " pod="openshift-nmstate/nmstate-handler-mdh82" Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.893683 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2wd6\" (UniqueName: \"kubernetes.io/projected/db5d41a4-2808-4189-8c3e-e0730cdf1a4f-kube-api-access-s2wd6\") pod \"nmstate-webhook-5f558f5558-9cfpv\" (UID: \"db5d41a4-2808-4189-8c3e-e0730cdf1a4f\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-9cfpv" Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.947848 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-wr9vc" Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.968685 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2k6r2\" (UniqueName: \"kubernetes.io/projected/a91c6f2b-7646-4f4d-bdc2-47304e36da4e-kube-api-access-2k6r2\") pod \"nmstate-console-plugin-86f58fcf4-l98lx\" (UID: \"a91c6f2b-7646-4f4d-bdc2-47304e36da4e\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-l98lx" Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.968749 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a91c6f2b-7646-4f4d-bdc2-47304e36da4e-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-l98lx\" (UID: \"a91c6f2b-7646-4f4d-bdc2-47304e36da4e\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-l98lx" Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.968772 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a91c6f2b-7646-4f4d-bdc2-47304e36da4e-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-l98lx\" (UID: \"a91c6f2b-7646-4f4d-bdc2-47304e36da4e\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-l98lx" Mar 20 11:10:40 crc kubenswrapper[4860]: E0320 11:10:40.968968 4860 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Mar 20 11:10:40 crc kubenswrapper[4860]: E0320 11:10:40.969034 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a91c6f2b-7646-4f4d-bdc2-47304e36da4e-plugin-serving-cert podName:a91c6f2b-7646-4f4d-bdc2-47304e36da4e nodeName:}" failed. No retries permitted until 2026-03-20 11:10:41.469015153 +0000 UTC m=+965.690376051 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/a91c6f2b-7646-4f4d-bdc2-47304e36da4e-plugin-serving-cert") pod "nmstate-console-plugin-86f58fcf4-l98lx" (UID: "a91c6f2b-7646-4f4d-bdc2-47304e36da4e") : secret "plugin-serving-cert" not found Mar 20 11:10:40 crc kubenswrapper[4860]: I0320 11:10:40.970482 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a91c6f2b-7646-4f4d-bdc2-47304e36da4e-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-l98lx\" (UID: \"a91c6f2b-7646-4f4d-bdc2-47304e36da4e\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-l98lx" Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:40.999927 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2k6r2\" (UniqueName: \"kubernetes.io/projected/a91c6f2b-7646-4f4d-bdc2-47304e36da4e-kube-api-access-2k6r2\") pod \"nmstate-console-plugin-86f58fcf4-l98lx\" (UID: \"a91c6f2b-7646-4f4d-bdc2-47304e36da4e\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-l98lx" Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.001513 4860 scope.go:117] "RemoveContainer" containerID="d842a5cd77f0d9f0965cbf10a0f92313f544e8649c8e3427de05d3a92939e32e" Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.023964 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-mdh82" Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.032128 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-cc764bbd9-xmdhb"] Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.033129 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-cc764bbd9-xmdhb" Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.053352 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-cc764bbd9-xmdhb"] Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.065511 4860 scope.go:117] "RemoveContainer" containerID="e1d897530152cf8d1ddca69100f0ae29a4da57552de29a47aaed46aa70fa805e" Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.070802 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42d7da76-e33e-46c9-a2ef-1715fb8e8500-trusted-ca-bundle\") pod \"console-cc764bbd9-xmdhb\" (UID: \"42d7da76-e33e-46c9-a2ef-1715fb8e8500\") " pod="openshift-console/console-cc764bbd9-xmdhb" Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.071402 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/42d7da76-e33e-46c9-a2ef-1715fb8e8500-console-config\") pod \"console-cc764bbd9-xmdhb\" (UID: \"42d7da76-e33e-46c9-a2ef-1715fb8e8500\") " pod="openshift-console/console-cc764bbd9-xmdhb" Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.071431 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/42d7da76-e33e-46c9-a2ef-1715fb8e8500-console-oauth-config\") pod \"console-cc764bbd9-xmdhb\" (UID: \"42d7da76-e33e-46c9-a2ef-1715fb8e8500\") " pod="openshift-console/console-cc764bbd9-xmdhb" Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.071467 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/42d7da76-e33e-46c9-a2ef-1715fb8e8500-service-ca\") pod \"console-cc764bbd9-xmdhb\" (UID: \"42d7da76-e33e-46c9-a2ef-1715fb8e8500\") 
" pod="openshift-console/console-cc764bbd9-xmdhb" Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.071510 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pz8zt\" (UniqueName: \"kubernetes.io/projected/42d7da76-e33e-46c9-a2ef-1715fb8e8500-kube-api-access-pz8zt\") pod \"console-cc764bbd9-xmdhb\" (UID: \"42d7da76-e33e-46c9-a2ef-1715fb8e8500\") " pod="openshift-console/console-cc764bbd9-xmdhb" Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.071560 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/42d7da76-e33e-46c9-a2ef-1715fb8e8500-oauth-serving-cert\") pod \"console-cc764bbd9-xmdhb\" (UID: \"42d7da76-e33e-46c9-a2ef-1715fb8e8500\") " pod="openshift-console/console-cc764bbd9-xmdhb" Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.071596 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/42d7da76-e33e-46c9-a2ef-1715fb8e8500-console-serving-cert\") pod \"console-cc764bbd9-xmdhb\" (UID: \"42d7da76-e33e-46c9-a2ef-1715fb8e8500\") " pod="openshift-console/console-cc764bbd9-xmdhb" Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.173338 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/42d7da76-e33e-46c9-a2ef-1715fb8e8500-oauth-serving-cert\") pod \"console-cc764bbd9-xmdhb\" (UID: \"42d7da76-e33e-46c9-a2ef-1715fb8e8500\") " pod="openshift-console/console-cc764bbd9-xmdhb" Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.173748 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/42d7da76-e33e-46c9-a2ef-1715fb8e8500-console-serving-cert\") pod \"console-cc764bbd9-xmdhb\" (UID: 
\"42d7da76-e33e-46c9-a2ef-1715fb8e8500\") " pod="openshift-console/console-cc764bbd9-xmdhb" Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.173780 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42d7da76-e33e-46c9-a2ef-1715fb8e8500-trusted-ca-bundle\") pod \"console-cc764bbd9-xmdhb\" (UID: \"42d7da76-e33e-46c9-a2ef-1715fb8e8500\") " pod="openshift-console/console-cc764bbd9-xmdhb" Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.173809 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/42d7da76-e33e-46c9-a2ef-1715fb8e8500-console-config\") pod \"console-cc764bbd9-xmdhb\" (UID: \"42d7da76-e33e-46c9-a2ef-1715fb8e8500\") " pod="openshift-console/console-cc764bbd9-xmdhb" Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.173839 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/42d7da76-e33e-46c9-a2ef-1715fb8e8500-console-oauth-config\") pod \"console-cc764bbd9-xmdhb\" (UID: \"42d7da76-e33e-46c9-a2ef-1715fb8e8500\") " pod="openshift-console/console-cc764bbd9-xmdhb" Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.173893 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/42d7da76-e33e-46c9-a2ef-1715fb8e8500-service-ca\") pod \"console-cc764bbd9-xmdhb\" (UID: \"42d7da76-e33e-46c9-a2ef-1715fb8e8500\") " pod="openshift-console/console-cc764bbd9-xmdhb" Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.173937 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pz8zt\" (UniqueName: \"kubernetes.io/projected/42d7da76-e33e-46c9-a2ef-1715fb8e8500-kube-api-access-pz8zt\") pod \"console-cc764bbd9-xmdhb\" (UID: \"42d7da76-e33e-46c9-a2ef-1715fb8e8500\") " 
pod="openshift-console/console-cc764bbd9-xmdhb" Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.175849 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/42d7da76-e33e-46c9-a2ef-1715fb8e8500-oauth-serving-cert\") pod \"console-cc764bbd9-xmdhb\" (UID: \"42d7da76-e33e-46c9-a2ef-1715fb8e8500\") " pod="openshift-console/console-cc764bbd9-xmdhb" Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.176407 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/42d7da76-e33e-46c9-a2ef-1715fb8e8500-console-config\") pod \"console-cc764bbd9-xmdhb\" (UID: \"42d7da76-e33e-46c9-a2ef-1715fb8e8500\") " pod="openshift-console/console-cc764bbd9-xmdhb" Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.177117 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/42d7da76-e33e-46c9-a2ef-1715fb8e8500-service-ca\") pod \"console-cc764bbd9-xmdhb\" (UID: \"42d7da76-e33e-46c9-a2ef-1715fb8e8500\") " pod="openshift-console/console-cc764bbd9-xmdhb" Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.177479 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42d7da76-e33e-46c9-a2ef-1715fb8e8500-trusted-ca-bundle\") pod \"console-cc764bbd9-xmdhb\" (UID: \"42d7da76-e33e-46c9-a2ef-1715fb8e8500\") " pod="openshift-console/console-cc764bbd9-xmdhb" Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.189092 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/42d7da76-e33e-46c9-a2ef-1715fb8e8500-console-serving-cert\") pod \"console-cc764bbd9-xmdhb\" (UID: \"42d7da76-e33e-46c9-a2ef-1715fb8e8500\") " pod="openshift-console/console-cc764bbd9-xmdhb" Mar 20 11:10:41 crc kubenswrapper[4860]: 
I0320 11:10:41.192004 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/42d7da76-e33e-46c9-a2ef-1715fb8e8500-console-oauth-config\") pod \"console-cc764bbd9-xmdhb\" (UID: \"42d7da76-e33e-46c9-a2ef-1715fb8e8500\") " pod="openshift-console/console-cc764bbd9-xmdhb" Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.201364 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pz8zt\" (UniqueName: \"kubernetes.io/projected/42d7da76-e33e-46c9-a2ef-1715fb8e8500-kube-api-access-pz8zt\") pod \"console-cc764bbd9-xmdhb\" (UID: \"42d7da76-e33e-46c9-a2ef-1715fb8e8500\") " pod="openshift-console/console-cc764bbd9-xmdhb" Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.293711 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-wr9vc"] Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.375731 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-cc764bbd9-xmdhb" Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.376242 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/db5d41a4-2808-4189-8c3e-e0730cdf1a4f-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-9cfpv\" (UID: \"db5d41a4-2808-4189-8c3e-e0730cdf1a4f\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-9cfpv" Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.384177 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/db5d41a4-2808-4189-8c3e-e0730cdf1a4f-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-9cfpv\" (UID: \"db5d41a4-2808-4189-8c3e-e0730cdf1a4f\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-9cfpv" Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.477656 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a91c6f2b-7646-4f4d-bdc2-47304e36da4e-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-l98lx\" (UID: \"a91c6f2b-7646-4f4d-bdc2-47304e36da4e\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-l98lx" Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.483790 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a91c6f2b-7646-4f4d-bdc2-47304e36da4e-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-l98lx\" (UID: \"a91c6f2b-7646-4f4d-bdc2-47304e36da4e\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-l98lx" Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.575972 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-cc764bbd9-xmdhb"] Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.576415 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-9cfpv" Mar 20 11:10:41 crc kubenswrapper[4860]: W0320 11:10:41.584246 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42d7da76_e33e_46c9_a2ef_1715fb8e8500.slice/crio-5884b3549c8041244f1ecb77a9d8f4d57eefb8cb54e37132c324e9d17047044f WatchSource:0}: Error finding container 5884b3549c8041244f1ecb77a9d8f4d57eefb8cb54e37132c324e9d17047044f: Status 404 returned error can't find the container with id 5884b3549c8041244f1ecb77a9d8f4d57eefb8cb54e37132c324e9d17047044f Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.734843 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cmc44_a89c8af2-338f-401f-aad5-c6d7763a3b3a/kube-multus/2.log" Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.736524 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-mdh82" event={"ID":"ef7f3b63-3a7d-483b-95c1-32961dad6226","Type":"ContainerStarted","Data":"a27495590bd1154ea3d0454792b6a5b090f14e5d389499e7f55b1cf8c57d0a90"} Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.738794 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-cc764bbd9-xmdhb" event={"ID":"42d7da76-e33e-46c9-a2ef-1715fb8e8500","Type":"ContainerStarted","Data":"5884b3549c8041244f1ecb77a9d8f4d57eefb8cb54e37132c324e9d17047044f"} Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.739846 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-wr9vc" event={"ID":"6f56c0b5-3d27-49e6-af5b-6ad929d9e857","Type":"ContainerStarted","Data":"f74e0b7a674065bb9df6ebca3548534d5143eca7602fb403d78610a6263a7e79"} Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.757413 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-l98lx" Mar 20 11:10:41 crc kubenswrapper[4860]: I0320 11:10:41.784567 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-9cfpv"] Mar 20 11:10:41 crc kubenswrapper[4860]: W0320 11:10:41.837264 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb5d41a4_2808_4189_8c3e_e0730cdf1a4f.slice/crio-563dfec71651638ecb2dc9559abba0fdf6cc5772fd1139322b8938ff2790e36d WatchSource:0}: Error finding container 563dfec71651638ecb2dc9559abba0fdf6cc5772fd1139322b8938ff2790e36d: Status 404 returned error can't find the container with id 563dfec71651638ecb2dc9559abba0fdf6cc5772fd1139322b8938ff2790e36d Mar 20 11:10:42 crc kubenswrapper[4860]: I0320 11:10:42.014134 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-l98lx"] Mar 20 11:10:42 crc kubenswrapper[4860]: I0320 11:10:42.749962 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-9cfpv" event={"ID":"db5d41a4-2808-4189-8c3e-e0730cdf1a4f","Type":"ContainerStarted","Data":"563dfec71651638ecb2dc9559abba0fdf6cc5772fd1139322b8938ff2790e36d"} Mar 20 11:10:42 crc kubenswrapper[4860]: I0320 11:10:42.754607 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-cc764bbd9-xmdhb" event={"ID":"42d7da76-e33e-46c9-a2ef-1715fb8e8500","Type":"ContainerStarted","Data":"adf76be9e6a417f21ea9010902bc5927911fceabcd94bbab61923ca33fb1801d"} Mar 20 11:10:42 crc kubenswrapper[4860]: I0320 11:10:42.756079 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-l98lx" event={"ID":"a91c6f2b-7646-4f4d-bdc2-47304e36da4e","Type":"ContainerStarted","Data":"d30eceaa7d60c69131cb3e850cb2ea6ba9d5379a363fc20c5a6c5894895c4402"} Mar 20 11:10:42 crc 
kubenswrapper[4860]: I0320 11:10:42.779816 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-cc764bbd9-xmdhb" podStartSLOduration=1.779786933 podStartE2EDuration="1.779786933s" podCreationTimestamp="2026-03-20 11:10:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 11:10:42.775369272 +0000 UTC m=+966.996730190" watchObservedRunningTime="2026-03-20 11:10:42.779786933 +0000 UTC m=+967.001147831" Mar 20 11:10:43 crc kubenswrapper[4860]: I0320 11:10:43.144050 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-b9j9j" Mar 20 11:10:43 crc kubenswrapper[4860]: I0320 11:10:43.222466 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-b9j9j" Mar 20 11:10:43 crc kubenswrapper[4860]: I0320 11:10:43.383085 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-b9j9j"] Mar 20 11:10:44 crc kubenswrapper[4860]: I0320 11:10:44.791011 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-wr9vc" event={"ID":"6f56c0b5-3d27-49e6-af5b-6ad929d9e857","Type":"ContainerStarted","Data":"bebfa7562c61cae81e8ea8e2b71c6a147d1fcfc6bb35c09497bd07de9d2ae333"} Mar 20 11:10:44 crc kubenswrapper[4860]: I0320 11:10:44.795218 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-b9j9j" podUID="be396660-5e1e-4dfe-9a08-26b2fcc69a0a" containerName="registry-server" containerID="cri-o://cd3a05044bfe778d42dcd060ef5d691bbbe313f7cf0b5e7dfb5bc41edbe2c760" gracePeriod=2 Mar 20 11:10:44 crc kubenswrapper[4860]: I0320 11:10:44.796206 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-9cfpv" 
event={"ID":"db5d41a4-2808-4189-8c3e-e0730cdf1a4f","Type":"ContainerStarted","Data":"22a8f73c83319d6ecc9355fa4c7d585428493a897367a07354f1f2674beb3e72"} Mar 20 11:10:44 crc kubenswrapper[4860]: I0320 11:10:44.796247 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-9cfpv" Mar 20 11:10:44 crc kubenswrapper[4860]: I0320 11:10:44.826675 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-9cfpv" podStartSLOduration=2.115690119 podStartE2EDuration="4.826653826s" podCreationTimestamp="2026-03-20 11:10:40 +0000 UTC" firstStartedPulling="2026-03-20 11:10:41.840334602 +0000 UTC m=+966.061695500" lastFinishedPulling="2026-03-20 11:10:44.551298299 +0000 UTC m=+968.772659207" observedRunningTime="2026-03-20 11:10:44.824396794 +0000 UTC m=+969.045757692" watchObservedRunningTime="2026-03-20 11:10:44.826653826 +0000 UTC m=+969.048014724" Mar 20 11:10:45 crc kubenswrapper[4860]: I0320 11:10:45.153281 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-b9j9j" Mar 20 11:10:45 crc kubenswrapper[4860]: I0320 11:10:45.346512 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7tgs\" (UniqueName: \"kubernetes.io/projected/be396660-5e1e-4dfe-9a08-26b2fcc69a0a-kube-api-access-q7tgs\") pod \"be396660-5e1e-4dfe-9a08-26b2fcc69a0a\" (UID: \"be396660-5e1e-4dfe-9a08-26b2fcc69a0a\") " Mar 20 11:10:45 crc kubenswrapper[4860]: I0320 11:10:45.346640 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be396660-5e1e-4dfe-9a08-26b2fcc69a0a-catalog-content\") pod \"be396660-5e1e-4dfe-9a08-26b2fcc69a0a\" (UID: \"be396660-5e1e-4dfe-9a08-26b2fcc69a0a\") " Mar 20 11:10:45 crc kubenswrapper[4860]: I0320 11:10:45.346751 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be396660-5e1e-4dfe-9a08-26b2fcc69a0a-utilities\") pod \"be396660-5e1e-4dfe-9a08-26b2fcc69a0a\" (UID: \"be396660-5e1e-4dfe-9a08-26b2fcc69a0a\") " Mar 20 11:10:45 crc kubenswrapper[4860]: I0320 11:10:45.348163 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be396660-5e1e-4dfe-9a08-26b2fcc69a0a-utilities" (OuterVolumeSpecName: "utilities") pod "be396660-5e1e-4dfe-9a08-26b2fcc69a0a" (UID: "be396660-5e1e-4dfe-9a08-26b2fcc69a0a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:10:45 crc kubenswrapper[4860]: I0320 11:10:45.354169 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be396660-5e1e-4dfe-9a08-26b2fcc69a0a-kube-api-access-q7tgs" (OuterVolumeSpecName: "kube-api-access-q7tgs") pod "be396660-5e1e-4dfe-9a08-26b2fcc69a0a" (UID: "be396660-5e1e-4dfe-9a08-26b2fcc69a0a"). InnerVolumeSpecName "kube-api-access-q7tgs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:10:45 crc kubenswrapper[4860]: I0320 11:10:45.450744 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q7tgs\" (UniqueName: \"kubernetes.io/projected/be396660-5e1e-4dfe-9a08-26b2fcc69a0a-kube-api-access-q7tgs\") on node \"crc\" DevicePath \"\"" Mar 20 11:10:45 crc kubenswrapper[4860]: I0320 11:10:45.450789 4860 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be396660-5e1e-4dfe-9a08-26b2fcc69a0a-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:10:45 crc kubenswrapper[4860]: I0320 11:10:45.484675 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be396660-5e1e-4dfe-9a08-26b2fcc69a0a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "be396660-5e1e-4dfe-9a08-26b2fcc69a0a" (UID: "be396660-5e1e-4dfe-9a08-26b2fcc69a0a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:10:45 crc kubenswrapper[4860]: I0320 11:10:45.552190 4860 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be396660-5e1e-4dfe-9a08-26b2fcc69a0a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:10:46 crc kubenswrapper[4860]: I0320 11:10:46.468896 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-mdh82" event={"ID":"ef7f3b63-3a7d-483b-95c1-32961dad6226","Type":"ContainerStarted","Data":"8344bd2879de4da2b9ad33341da7217357530e39b5cb4b8649c24afd700d6f1a"} Mar 20 11:10:46 crc kubenswrapper[4860]: I0320 11:10:46.468987 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-mdh82" Mar 20 11:10:46 crc kubenswrapper[4860]: I0320 11:10:46.477922 4860 generic.go:334] "Generic (PLEG): container finished" podID="be396660-5e1e-4dfe-9a08-26b2fcc69a0a" 
containerID="cd3a05044bfe778d42dcd060ef5d691bbbe313f7cf0b5e7dfb5bc41edbe2c760" exitCode=0 Mar 20 11:10:46 crc kubenswrapper[4860]: I0320 11:10:46.478010 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b9j9j" event={"ID":"be396660-5e1e-4dfe-9a08-26b2fcc69a0a","Type":"ContainerDied","Data":"cd3a05044bfe778d42dcd060ef5d691bbbe313f7cf0b5e7dfb5bc41edbe2c760"} Mar 20 11:10:46 crc kubenswrapper[4860]: I0320 11:10:46.480878 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b9j9j" event={"ID":"be396660-5e1e-4dfe-9a08-26b2fcc69a0a","Type":"ContainerDied","Data":"7ba86a3ba39138c4bcf143a2055d72aa4da66a6fe5dfcdc6f44ec0ae82cefec5"} Mar 20 11:10:46 crc kubenswrapper[4860]: I0320 11:10:46.478060 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b9j9j" Mar 20 11:10:46 crc kubenswrapper[4860]: I0320 11:10:46.480915 4860 scope.go:117] "RemoveContainer" containerID="cd3a05044bfe778d42dcd060ef5d691bbbe313f7cf0b5e7dfb5bc41edbe2c760" Mar 20 11:10:46 crc kubenswrapper[4860]: I0320 11:10:46.489958 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-mdh82" podStartSLOduration=3.026331702 podStartE2EDuration="6.489882222s" podCreationTimestamp="2026-03-20 11:10:40 +0000 UTC" firstStartedPulling="2026-03-20 11:10:41.096346814 +0000 UTC m=+965.317707712" lastFinishedPulling="2026-03-20 11:10:44.559897334 +0000 UTC m=+968.781258232" observedRunningTime="2026-03-20 11:10:46.48686421 +0000 UTC m=+970.708225108" watchObservedRunningTime="2026-03-20 11:10:46.489882222 +0000 UTC m=+970.711243120" Mar 20 11:10:46 crc kubenswrapper[4860]: I0320 11:10:46.526973 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-b9j9j"] Mar 20 11:10:46 crc kubenswrapper[4860]: I0320 11:10:46.532042 4860 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openshift-marketplace/redhat-operators-b9j9j"] Mar 20 11:10:46 crc kubenswrapper[4860]: I0320 11:10:46.559214 4860 scope.go:117] "RemoveContainer" containerID="e64b2de3852cba02ad07be7827131798bd5071b683d1f700c0238cb432da1cfd" Mar 20 11:10:46 crc kubenswrapper[4860]: I0320 11:10:46.589046 4860 scope.go:117] "RemoveContainer" containerID="6175a4aeb09e5e4ddb845b3a23c481aab432f44fb75f04e279bd3991e69ab364" Mar 20 11:10:46 crc kubenswrapper[4860]: I0320 11:10:46.730342 4860 scope.go:117] "RemoveContainer" containerID="cd3a05044bfe778d42dcd060ef5d691bbbe313f7cf0b5e7dfb5bc41edbe2c760" Mar 20 11:10:46 crc kubenswrapper[4860]: E0320 11:10:46.731004 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd3a05044bfe778d42dcd060ef5d691bbbe313f7cf0b5e7dfb5bc41edbe2c760\": container with ID starting with cd3a05044bfe778d42dcd060ef5d691bbbe313f7cf0b5e7dfb5bc41edbe2c760 not found: ID does not exist" containerID="cd3a05044bfe778d42dcd060ef5d691bbbe313f7cf0b5e7dfb5bc41edbe2c760" Mar 20 11:10:46 crc kubenswrapper[4860]: I0320 11:10:46.731074 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd3a05044bfe778d42dcd060ef5d691bbbe313f7cf0b5e7dfb5bc41edbe2c760"} err="failed to get container status \"cd3a05044bfe778d42dcd060ef5d691bbbe313f7cf0b5e7dfb5bc41edbe2c760\": rpc error: code = NotFound desc = could not find container \"cd3a05044bfe778d42dcd060ef5d691bbbe313f7cf0b5e7dfb5bc41edbe2c760\": container with ID starting with cd3a05044bfe778d42dcd060ef5d691bbbe313f7cf0b5e7dfb5bc41edbe2c760 not found: ID does not exist" Mar 20 11:10:46 crc kubenswrapper[4860]: I0320 11:10:46.731117 4860 scope.go:117] "RemoveContainer" containerID="e64b2de3852cba02ad07be7827131798bd5071b683d1f700c0238cb432da1cfd" Mar 20 11:10:46 crc kubenswrapper[4860]: E0320 11:10:46.731632 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound 
desc = could not find container \"e64b2de3852cba02ad07be7827131798bd5071b683d1f700c0238cb432da1cfd\": container with ID starting with e64b2de3852cba02ad07be7827131798bd5071b683d1f700c0238cb432da1cfd not found: ID does not exist" containerID="e64b2de3852cba02ad07be7827131798bd5071b683d1f700c0238cb432da1cfd" Mar 20 11:10:46 crc kubenswrapper[4860]: I0320 11:10:46.731694 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e64b2de3852cba02ad07be7827131798bd5071b683d1f700c0238cb432da1cfd"} err="failed to get container status \"e64b2de3852cba02ad07be7827131798bd5071b683d1f700c0238cb432da1cfd\": rpc error: code = NotFound desc = could not find container \"e64b2de3852cba02ad07be7827131798bd5071b683d1f700c0238cb432da1cfd\": container with ID starting with e64b2de3852cba02ad07be7827131798bd5071b683d1f700c0238cb432da1cfd not found: ID does not exist" Mar 20 11:10:46 crc kubenswrapper[4860]: I0320 11:10:46.731741 4860 scope.go:117] "RemoveContainer" containerID="6175a4aeb09e5e4ddb845b3a23c481aab432f44fb75f04e279bd3991e69ab364" Mar 20 11:10:46 crc kubenswrapper[4860]: E0320 11:10:46.732165 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6175a4aeb09e5e4ddb845b3a23c481aab432f44fb75f04e279bd3991e69ab364\": container with ID starting with 6175a4aeb09e5e4ddb845b3a23c481aab432f44fb75f04e279bd3991e69ab364 not found: ID does not exist" containerID="6175a4aeb09e5e4ddb845b3a23c481aab432f44fb75f04e279bd3991e69ab364" Mar 20 11:10:46 crc kubenswrapper[4860]: I0320 11:10:46.732189 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6175a4aeb09e5e4ddb845b3a23c481aab432f44fb75f04e279bd3991e69ab364"} err="failed to get container status \"6175a4aeb09e5e4ddb845b3a23c481aab432f44fb75f04e279bd3991e69ab364\": rpc error: code = NotFound desc = could not find container 
\"6175a4aeb09e5e4ddb845b3a23c481aab432f44fb75f04e279bd3991e69ab364\": container with ID starting with 6175a4aeb09e5e4ddb845b3a23c481aab432f44fb75f04e279bd3991e69ab364 not found: ID does not exist" Mar 20 11:10:47 crc kubenswrapper[4860]: I0320 11:10:47.427737 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be396660-5e1e-4dfe-9a08-26b2fcc69a0a" path="/var/lib/kubelet/pods/be396660-5e1e-4dfe-9a08-26b2fcc69a0a/volumes" Mar 20 11:10:47 crc kubenswrapper[4860]: I0320 11:10:47.493788 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-l98lx" event={"ID":"a91c6f2b-7646-4f4d-bdc2-47304e36da4e","Type":"ContainerStarted","Data":"e0069371eb676eef23776e18e272defe4de88da747fe9fd9340cf99f976033ac"} Mar 20 11:10:47 crc kubenswrapper[4860]: I0320 11:10:47.515625 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-l98lx" podStartSLOduration=2.788907831 podStartE2EDuration="7.51558994s" podCreationTimestamp="2026-03-20 11:10:40 +0000 UTC" firstStartedPulling="2026-03-20 11:10:42.02646922 +0000 UTC m=+966.247830118" lastFinishedPulling="2026-03-20 11:10:46.753151329 +0000 UTC m=+970.974512227" observedRunningTime="2026-03-20 11:10:47.510925813 +0000 UTC m=+971.732286731" watchObservedRunningTime="2026-03-20 11:10:47.51558994 +0000 UTC m=+971.736950838" Mar 20 11:10:48 crc kubenswrapper[4860]: I0320 11:10:48.503747 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-wr9vc" event={"ID":"6f56c0b5-3d27-49e6-af5b-6ad929d9e857","Type":"ContainerStarted","Data":"54ba53c2a41cd712f4f8515e69ca2d5dd8e9b98b94af121599a9b32c107d5efb"} Mar 20 11:10:48 crc kubenswrapper[4860]: I0320 11:10:48.530505 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-wr9vc" podStartSLOduration=1.9662549139999999 
podStartE2EDuration="8.530479038s" podCreationTimestamp="2026-03-20 11:10:40 +0000 UTC" firstStartedPulling="2026-03-20 11:10:41.31091673 +0000 UTC m=+965.532277628" lastFinishedPulling="2026-03-20 11:10:47.875140854 +0000 UTC m=+972.096501752" observedRunningTime="2026-03-20 11:10:48.524815335 +0000 UTC m=+972.746176233" watchObservedRunningTime="2026-03-20 11:10:48.530479038 +0000 UTC m=+972.751839936" Mar 20 11:10:51 crc kubenswrapper[4860]: I0320 11:10:51.053620 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-mdh82" Mar 20 11:10:51 crc kubenswrapper[4860]: I0320 11:10:51.376763 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-cc764bbd9-xmdhb" Mar 20 11:10:51 crc kubenswrapper[4860]: I0320 11:10:51.376842 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-cc764bbd9-xmdhb" Mar 20 11:10:51 crc kubenswrapper[4860]: I0320 11:10:51.383637 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-cc764bbd9-xmdhb" Mar 20 11:10:51 crc kubenswrapper[4860]: I0320 11:10:51.527356 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-cc764bbd9-xmdhb" Mar 20 11:10:51 crc kubenswrapper[4860]: I0320 11:10:51.585078 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-sqrz5"] Mar 20 11:10:52 crc kubenswrapper[4860]: I0320 11:10:52.344535 4860 patch_prober.go:28] interesting pod/machine-config-daemon-kvdqp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:10:52 crc kubenswrapper[4860]: I0320 11:10:52.345080 4860 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:11:01 crc kubenswrapper[4860]: I0320 11:11:01.582356 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-9cfpv" Mar 20 11:11:14 crc kubenswrapper[4860]: I0320 11:11:14.864918 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sgngn"] Mar 20 11:11:14 crc kubenswrapper[4860]: E0320 11:11:14.866099 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be396660-5e1e-4dfe-9a08-26b2fcc69a0a" containerName="registry-server" Mar 20 11:11:14 crc kubenswrapper[4860]: I0320 11:11:14.866116 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="be396660-5e1e-4dfe-9a08-26b2fcc69a0a" containerName="registry-server" Mar 20 11:11:14 crc kubenswrapper[4860]: E0320 11:11:14.866139 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be396660-5e1e-4dfe-9a08-26b2fcc69a0a" containerName="extract-utilities" Mar 20 11:11:14 crc kubenswrapper[4860]: I0320 11:11:14.866146 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="be396660-5e1e-4dfe-9a08-26b2fcc69a0a" containerName="extract-utilities" Mar 20 11:11:14 crc kubenswrapper[4860]: E0320 11:11:14.866155 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be396660-5e1e-4dfe-9a08-26b2fcc69a0a" containerName="extract-content" Mar 20 11:11:14 crc kubenswrapper[4860]: I0320 11:11:14.866161 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="be396660-5e1e-4dfe-9a08-26b2fcc69a0a" containerName="extract-content" Mar 20 11:11:14 crc kubenswrapper[4860]: I0320 11:11:14.866316 4860 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="be396660-5e1e-4dfe-9a08-26b2fcc69a0a" containerName="registry-server" Mar 20 11:11:14 crc kubenswrapper[4860]: I0320 11:11:14.867309 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sgngn" Mar 20 11:11:14 crc kubenswrapper[4860]: I0320 11:11:14.872150 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 20 11:11:14 crc kubenswrapper[4860]: I0320 11:11:14.877759 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sgngn"] Mar 20 11:11:15 crc kubenswrapper[4860]: I0320 11:11:15.002750 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7da0294e-5ac5-4655-b882-cfd1f36ce791-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sgngn\" (UID: \"7da0294e-5ac5-4655-b882-cfd1f36ce791\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sgngn" Mar 20 11:11:15 crc kubenswrapper[4860]: I0320 11:11:15.002807 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7da0294e-5ac5-4655-b882-cfd1f36ce791-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sgngn\" (UID: \"7da0294e-5ac5-4655-b882-cfd1f36ce791\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sgngn" Mar 20 11:11:15 crc kubenswrapper[4860]: I0320 11:11:15.002955 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6frwf\" (UniqueName: \"kubernetes.io/projected/7da0294e-5ac5-4655-b882-cfd1f36ce791-kube-api-access-6frwf\") pod 
\"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sgngn\" (UID: \"7da0294e-5ac5-4655-b882-cfd1f36ce791\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sgngn" Mar 20 11:11:15 crc kubenswrapper[4860]: I0320 11:11:15.104577 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6frwf\" (UniqueName: \"kubernetes.io/projected/7da0294e-5ac5-4655-b882-cfd1f36ce791-kube-api-access-6frwf\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sgngn\" (UID: \"7da0294e-5ac5-4655-b882-cfd1f36ce791\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sgngn" Mar 20 11:11:15 crc kubenswrapper[4860]: I0320 11:11:15.104672 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7da0294e-5ac5-4655-b882-cfd1f36ce791-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sgngn\" (UID: \"7da0294e-5ac5-4655-b882-cfd1f36ce791\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sgngn" Mar 20 11:11:15 crc kubenswrapper[4860]: I0320 11:11:15.104699 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7da0294e-5ac5-4655-b882-cfd1f36ce791-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sgngn\" (UID: \"7da0294e-5ac5-4655-b882-cfd1f36ce791\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sgngn" Mar 20 11:11:15 crc kubenswrapper[4860]: I0320 11:11:15.105365 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7da0294e-5ac5-4655-b882-cfd1f36ce791-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sgngn\" (UID: \"7da0294e-5ac5-4655-b882-cfd1f36ce791\") " 
pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sgngn" Mar 20 11:11:15 crc kubenswrapper[4860]: I0320 11:11:15.105420 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7da0294e-5ac5-4655-b882-cfd1f36ce791-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sgngn\" (UID: \"7da0294e-5ac5-4655-b882-cfd1f36ce791\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sgngn" Mar 20 11:11:15 crc kubenswrapper[4860]: I0320 11:11:15.126218 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6frwf\" (UniqueName: \"kubernetes.io/projected/7da0294e-5ac5-4655-b882-cfd1f36ce791-kube-api-access-6frwf\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sgngn\" (UID: \"7da0294e-5ac5-4655-b882-cfd1f36ce791\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sgngn" Mar 20 11:11:15 crc kubenswrapper[4860]: I0320 11:11:15.194216 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sgngn" Mar 20 11:11:15 crc kubenswrapper[4860]: I0320 11:11:15.608855 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sgngn"] Mar 20 11:11:15 crc kubenswrapper[4860]: I0320 11:11:15.701435 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sgngn" event={"ID":"7da0294e-5ac5-4655-b882-cfd1f36ce791","Type":"ContainerStarted","Data":"6605f719a5d599cbc168598e36bac59f42a20935c2dd222c9a8f61b2a7ff744b"} Mar 20 11:11:16 crc kubenswrapper[4860]: I0320 11:11:16.631269 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-sqrz5" podUID="e8ca532e-b0d7-494c-886f-bff0c8009707" containerName="console" containerID="cri-o://589d18b62b1f0e4971b0fb3e2fc61341927ed65c1c47ad52b5aa882f952f550d" gracePeriod=15 Mar 20 11:11:16 crc kubenswrapper[4860]: I0320 11:11:16.710627 4860 generic.go:334] "Generic (PLEG): container finished" podID="7da0294e-5ac5-4655-b882-cfd1f36ce791" containerID="22d182dcb02a0ca2445f79c27a70decf1dd99ec7371a15142ed192e16a2e8fa2" exitCode=0 Mar 20 11:11:16 crc kubenswrapper[4860]: I0320 11:11:16.710684 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sgngn" event={"ID":"7da0294e-5ac5-4655-b882-cfd1f36ce791","Type":"ContainerDied","Data":"22d182dcb02a0ca2445f79c27a70decf1dd99ec7371a15142ed192e16a2e8fa2"} Mar 20 11:11:17 crc kubenswrapper[4860]: I0320 11:11:17.016398 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-sqrz5_e8ca532e-b0d7-494c-886f-bff0c8009707/console/0.log" Mar 20 11:11:17 crc kubenswrapper[4860]: I0320 11:11:17.016503 4860 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-console/console-f9d7485db-sqrz5" Mar 20 11:11:17 crc kubenswrapper[4860]: I0320 11:11:17.137428 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e8ca532e-b0d7-494c-886f-bff0c8009707-console-config\") pod \"e8ca532e-b0d7-494c-886f-bff0c8009707\" (UID: \"e8ca532e-b0d7-494c-886f-bff0c8009707\") " Mar 20 11:11:17 crc kubenswrapper[4860]: I0320 11:11:17.137571 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8ca532e-b0d7-494c-886f-bff0c8009707-trusted-ca-bundle\") pod \"e8ca532e-b0d7-494c-886f-bff0c8009707\" (UID: \"e8ca532e-b0d7-494c-886f-bff0c8009707\") " Mar 20 11:11:17 crc kubenswrapper[4860]: I0320 11:11:17.137610 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e8ca532e-b0d7-494c-886f-bff0c8009707-service-ca\") pod \"e8ca532e-b0d7-494c-886f-bff0c8009707\" (UID: \"e8ca532e-b0d7-494c-886f-bff0c8009707\") " Mar 20 11:11:17 crc kubenswrapper[4860]: I0320 11:11:17.137662 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lk8bz\" (UniqueName: \"kubernetes.io/projected/e8ca532e-b0d7-494c-886f-bff0c8009707-kube-api-access-lk8bz\") pod \"e8ca532e-b0d7-494c-886f-bff0c8009707\" (UID: \"e8ca532e-b0d7-494c-886f-bff0c8009707\") " Mar 20 11:11:17 crc kubenswrapper[4860]: I0320 11:11:17.137796 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e8ca532e-b0d7-494c-886f-bff0c8009707-console-serving-cert\") pod \"e8ca532e-b0d7-494c-886f-bff0c8009707\" (UID: \"e8ca532e-b0d7-494c-886f-bff0c8009707\") " Mar 20 11:11:17 crc kubenswrapper[4860]: I0320 11:11:17.137846 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e8ca532e-b0d7-494c-886f-bff0c8009707-oauth-serving-cert\") pod \"e8ca532e-b0d7-494c-886f-bff0c8009707\" (UID: \"e8ca532e-b0d7-494c-886f-bff0c8009707\") " Mar 20 11:11:17 crc kubenswrapper[4860]: I0320 11:11:17.137872 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e8ca532e-b0d7-494c-886f-bff0c8009707-console-oauth-config\") pod \"e8ca532e-b0d7-494c-886f-bff0c8009707\" (UID: \"e8ca532e-b0d7-494c-886f-bff0c8009707\") " Mar 20 11:11:17 crc kubenswrapper[4860]: I0320 11:11:17.138256 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8ca532e-b0d7-494c-886f-bff0c8009707-console-config" (OuterVolumeSpecName: "console-config") pod "e8ca532e-b0d7-494c-886f-bff0c8009707" (UID: "e8ca532e-b0d7-494c-886f-bff0c8009707"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:11:17 crc kubenswrapper[4860]: I0320 11:11:17.138315 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8ca532e-b0d7-494c-886f-bff0c8009707-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "e8ca532e-b0d7-494c-886f-bff0c8009707" (UID: "e8ca532e-b0d7-494c-886f-bff0c8009707"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:11:17 crc kubenswrapper[4860]: I0320 11:11:17.138727 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8ca532e-b0d7-494c-886f-bff0c8009707-service-ca" (OuterVolumeSpecName: "service-ca") pod "e8ca532e-b0d7-494c-886f-bff0c8009707" (UID: "e8ca532e-b0d7-494c-886f-bff0c8009707"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:11:17 crc kubenswrapper[4860]: I0320 11:11:17.139154 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8ca532e-b0d7-494c-886f-bff0c8009707-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "e8ca532e-b0d7-494c-886f-bff0c8009707" (UID: "e8ca532e-b0d7-494c-886f-bff0c8009707"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:11:17 crc kubenswrapper[4860]: I0320 11:11:17.144742 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8ca532e-b0d7-494c-886f-bff0c8009707-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "e8ca532e-b0d7-494c-886f-bff0c8009707" (UID: "e8ca532e-b0d7-494c-886f-bff0c8009707"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:11:17 crc kubenswrapper[4860]: I0320 11:11:17.145803 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8ca532e-b0d7-494c-886f-bff0c8009707-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "e8ca532e-b0d7-494c-886f-bff0c8009707" (UID: "e8ca532e-b0d7-494c-886f-bff0c8009707"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:11:17 crc kubenswrapper[4860]: I0320 11:11:17.147079 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8ca532e-b0d7-494c-886f-bff0c8009707-kube-api-access-lk8bz" (OuterVolumeSpecName: "kube-api-access-lk8bz") pod "e8ca532e-b0d7-494c-886f-bff0c8009707" (UID: "e8ca532e-b0d7-494c-886f-bff0c8009707"). InnerVolumeSpecName "kube-api-access-lk8bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:11:17 crc kubenswrapper[4860]: I0320 11:11:17.239355 4860 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e8ca532e-b0d7-494c-886f-bff0c8009707-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 11:11:17 crc kubenswrapper[4860]: I0320 11:11:17.239402 4860 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e8ca532e-b0d7-494c-886f-bff0c8009707-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 11:11:17 crc kubenswrapper[4860]: I0320 11:11:17.239411 4860 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e8ca532e-b0d7-494c-886f-bff0c8009707-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 20 11:11:17 crc kubenswrapper[4860]: I0320 11:11:17.239424 4860 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e8ca532e-b0d7-494c-886f-bff0c8009707-console-config\") on node \"crc\" DevicePath \"\"" Mar 20 11:11:17 crc kubenswrapper[4860]: I0320 11:11:17.239443 4860 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8ca532e-b0d7-494c-886f-bff0c8009707-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 11:11:17 crc kubenswrapper[4860]: I0320 11:11:17.239476 4860 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e8ca532e-b0d7-494c-886f-bff0c8009707-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 11:11:17 crc kubenswrapper[4860]: I0320 11:11:17.239488 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lk8bz\" (UniqueName: \"kubernetes.io/projected/e8ca532e-b0d7-494c-886f-bff0c8009707-kube-api-access-lk8bz\") on node \"crc\" DevicePath \"\"" Mar 20 11:11:17 crc 
kubenswrapper[4860]: I0320 11:11:17.718672 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-sqrz5_e8ca532e-b0d7-494c-886f-bff0c8009707/console/0.log" Mar 20 11:11:17 crc kubenswrapper[4860]: I0320 11:11:17.718740 4860 generic.go:334] "Generic (PLEG): container finished" podID="e8ca532e-b0d7-494c-886f-bff0c8009707" containerID="589d18b62b1f0e4971b0fb3e2fc61341927ed65c1c47ad52b5aa882f952f550d" exitCode=2 Mar 20 11:11:17 crc kubenswrapper[4860]: I0320 11:11:17.718779 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-sqrz5" event={"ID":"e8ca532e-b0d7-494c-886f-bff0c8009707","Type":"ContainerDied","Data":"589d18b62b1f0e4971b0fb3e2fc61341927ed65c1c47ad52b5aa882f952f550d"} Mar 20 11:11:17 crc kubenswrapper[4860]: I0320 11:11:17.718809 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-sqrz5" event={"ID":"e8ca532e-b0d7-494c-886f-bff0c8009707","Type":"ContainerDied","Data":"72e1e1c0612e639b5d9b1dd93371fee28768245c503b21f6343128336d8f4145"} Mar 20 11:11:17 crc kubenswrapper[4860]: I0320 11:11:17.718831 4860 scope.go:117] "RemoveContainer" containerID="589d18b62b1f0e4971b0fb3e2fc61341927ed65c1c47ad52b5aa882f952f550d" Mar 20 11:11:17 crc kubenswrapper[4860]: I0320 11:11:17.719038 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-sqrz5" Mar 20 11:11:17 crc kubenswrapper[4860]: I0320 11:11:17.742944 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-sqrz5"] Mar 20 11:11:17 crc kubenswrapper[4860]: I0320 11:11:17.748039 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-sqrz5"] Mar 20 11:11:17 crc kubenswrapper[4860]: I0320 11:11:17.748295 4860 scope.go:117] "RemoveContainer" containerID="589d18b62b1f0e4971b0fb3e2fc61341927ed65c1c47ad52b5aa882f952f550d" Mar 20 11:11:17 crc kubenswrapper[4860]: E0320 11:11:17.749186 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"589d18b62b1f0e4971b0fb3e2fc61341927ed65c1c47ad52b5aa882f952f550d\": container with ID starting with 589d18b62b1f0e4971b0fb3e2fc61341927ed65c1c47ad52b5aa882f952f550d not found: ID does not exist" containerID="589d18b62b1f0e4971b0fb3e2fc61341927ed65c1c47ad52b5aa882f952f550d" Mar 20 11:11:17 crc kubenswrapper[4860]: I0320 11:11:17.749216 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"589d18b62b1f0e4971b0fb3e2fc61341927ed65c1c47ad52b5aa882f952f550d"} err="failed to get container status \"589d18b62b1f0e4971b0fb3e2fc61341927ed65c1c47ad52b5aa882f952f550d\": rpc error: code = NotFound desc = could not find container \"589d18b62b1f0e4971b0fb3e2fc61341927ed65c1c47ad52b5aa882f952f550d\": container with ID starting with 589d18b62b1f0e4971b0fb3e2fc61341927ed65c1c47ad52b5aa882f952f550d not found: ID does not exist" Mar 20 11:11:18 crc kubenswrapper[4860]: I0320 11:11:18.730251 4860 generic.go:334] "Generic (PLEG): container finished" podID="7da0294e-5ac5-4655-b882-cfd1f36ce791" containerID="5c8185a2c1b9ec4fbf156815fc31a2380dc1abffd3d79fed1741875444c47732" exitCode=0 Mar 20 11:11:18 crc kubenswrapper[4860]: I0320 11:11:18.730298 4860 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sgngn" event={"ID":"7da0294e-5ac5-4655-b882-cfd1f36ce791","Type":"ContainerDied","Data":"5c8185a2c1b9ec4fbf156815fc31a2380dc1abffd3d79fed1741875444c47732"} Mar 20 11:11:19 crc kubenswrapper[4860]: I0320 11:11:19.421852 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8ca532e-b0d7-494c-886f-bff0c8009707" path="/var/lib/kubelet/pods/e8ca532e-b0d7-494c-886f-bff0c8009707/volumes" Mar 20 11:11:19 crc kubenswrapper[4860]: I0320 11:11:19.622806 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-nskpj"] Mar 20 11:11:19 crc kubenswrapper[4860]: E0320 11:11:19.623129 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8ca532e-b0d7-494c-886f-bff0c8009707" containerName="console" Mar 20 11:11:19 crc kubenswrapper[4860]: I0320 11:11:19.623145 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8ca532e-b0d7-494c-886f-bff0c8009707" containerName="console" Mar 20 11:11:19 crc kubenswrapper[4860]: I0320 11:11:19.623324 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8ca532e-b0d7-494c-886f-bff0c8009707" containerName="console" Mar 20 11:11:19 crc kubenswrapper[4860]: I0320 11:11:19.624715 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nskpj" Mar 20 11:11:19 crc kubenswrapper[4860]: I0320 11:11:19.634540 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nskpj"] Mar 20 11:11:19 crc kubenswrapper[4860]: I0320 11:11:19.741029 4860 generic.go:334] "Generic (PLEG): container finished" podID="7da0294e-5ac5-4655-b882-cfd1f36ce791" containerID="0a9e0ba5e1736f05b3ed3f26a2231469f8608a5402919e94471951eed9b9dfc6" exitCode=0 Mar 20 11:11:19 crc kubenswrapper[4860]: I0320 11:11:19.741087 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sgngn" event={"ID":"7da0294e-5ac5-4655-b882-cfd1f36ce791","Type":"ContainerDied","Data":"0a9e0ba5e1736f05b3ed3f26a2231469f8608a5402919e94471951eed9b9dfc6"} Mar 20 11:11:19 crc kubenswrapper[4860]: I0320 11:11:19.780717 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/659f81bd-8f32-4dd2-9a63-6b2665fa6647-utilities\") pod \"redhat-marketplace-nskpj\" (UID: \"659f81bd-8f32-4dd2-9a63-6b2665fa6647\") " pod="openshift-marketplace/redhat-marketplace-nskpj" Mar 20 11:11:19 crc kubenswrapper[4860]: I0320 11:11:19.780805 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngdgx\" (UniqueName: \"kubernetes.io/projected/659f81bd-8f32-4dd2-9a63-6b2665fa6647-kube-api-access-ngdgx\") pod \"redhat-marketplace-nskpj\" (UID: \"659f81bd-8f32-4dd2-9a63-6b2665fa6647\") " pod="openshift-marketplace/redhat-marketplace-nskpj" Mar 20 11:11:19 crc kubenswrapper[4860]: I0320 11:11:19.780855 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/659f81bd-8f32-4dd2-9a63-6b2665fa6647-catalog-content\") pod 
\"redhat-marketplace-nskpj\" (UID: \"659f81bd-8f32-4dd2-9a63-6b2665fa6647\") " pod="openshift-marketplace/redhat-marketplace-nskpj" Mar 20 11:11:19 crc kubenswrapper[4860]: I0320 11:11:19.881817 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/659f81bd-8f32-4dd2-9a63-6b2665fa6647-catalog-content\") pod \"redhat-marketplace-nskpj\" (UID: \"659f81bd-8f32-4dd2-9a63-6b2665fa6647\") " pod="openshift-marketplace/redhat-marketplace-nskpj" Mar 20 11:11:19 crc kubenswrapper[4860]: I0320 11:11:19.881906 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/659f81bd-8f32-4dd2-9a63-6b2665fa6647-utilities\") pod \"redhat-marketplace-nskpj\" (UID: \"659f81bd-8f32-4dd2-9a63-6b2665fa6647\") " pod="openshift-marketplace/redhat-marketplace-nskpj" Mar 20 11:11:19 crc kubenswrapper[4860]: I0320 11:11:19.881950 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngdgx\" (UniqueName: \"kubernetes.io/projected/659f81bd-8f32-4dd2-9a63-6b2665fa6647-kube-api-access-ngdgx\") pod \"redhat-marketplace-nskpj\" (UID: \"659f81bd-8f32-4dd2-9a63-6b2665fa6647\") " pod="openshift-marketplace/redhat-marketplace-nskpj" Mar 20 11:11:19 crc kubenswrapper[4860]: I0320 11:11:19.882509 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/659f81bd-8f32-4dd2-9a63-6b2665fa6647-catalog-content\") pod \"redhat-marketplace-nskpj\" (UID: \"659f81bd-8f32-4dd2-9a63-6b2665fa6647\") " pod="openshift-marketplace/redhat-marketplace-nskpj" Mar 20 11:11:19 crc kubenswrapper[4860]: I0320 11:11:19.883214 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/659f81bd-8f32-4dd2-9a63-6b2665fa6647-utilities\") pod \"redhat-marketplace-nskpj\" (UID: 
\"659f81bd-8f32-4dd2-9a63-6b2665fa6647\") " pod="openshift-marketplace/redhat-marketplace-nskpj" Mar 20 11:11:19 crc kubenswrapper[4860]: I0320 11:11:19.905129 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngdgx\" (UniqueName: \"kubernetes.io/projected/659f81bd-8f32-4dd2-9a63-6b2665fa6647-kube-api-access-ngdgx\") pod \"redhat-marketplace-nskpj\" (UID: \"659f81bd-8f32-4dd2-9a63-6b2665fa6647\") " pod="openshift-marketplace/redhat-marketplace-nskpj" Mar 20 11:11:19 crc kubenswrapper[4860]: I0320 11:11:19.950678 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nskpj" Mar 20 11:11:20 crc kubenswrapper[4860]: I0320 11:11:20.189260 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nskpj"] Mar 20 11:11:20 crc kubenswrapper[4860]: I0320 11:11:20.747959 4860 generic.go:334] "Generic (PLEG): container finished" podID="659f81bd-8f32-4dd2-9a63-6b2665fa6647" containerID="a3af29e4cd182fe4ad8f9d7ac048c0de3c19c12bb591945ad5bc082f0dae2c7c" exitCode=0 Mar 20 11:11:20 crc kubenswrapper[4860]: I0320 11:11:20.748053 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nskpj" event={"ID":"659f81bd-8f32-4dd2-9a63-6b2665fa6647","Type":"ContainerDied","Data":"a3af29e4cd182fe4ad8f9d7ac048c0de3c19c12bb591945ad5bc082f0dae2c7c"} Mar 20 11:11:20 crc kubenswrapper[4860]: I0320 11:11:20.748436 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nskpj" event={"ID":"659f81bd-8f32-4dd2-9a63-6b2665fa6647","Type":"ContainerStarted","Data":"41dfd9902a89b0dfeabbbcc8c55d12c6c1350b3b1f92b4a588be3bf82690038f"} Mar 20 11:11:21 crc kubenswrapper[4860]: I0320 11:11:21.017849 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jwpk7"] Mar 20 11:11:21 crc kubenswrapper[4860]: I0320 11:11:21.019286 4860 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jwpk7" Mar 20 11:11:21 crc kubenswrapper[4860]: I0320 11:11:21.025883 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sgngn" Mar 20 11:11:21 crc kubenswrapper[4860]: I0320 11:11:21.052979 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jwpk7"] Mar 20 11:11:21 crc kubenswrapper[4860]: I0320 11:11:21.100062 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c65a3188-9f0a-4525-b876-635822d8dda4-catalog-content\") pod \"certified-operators-jwpk7\" (UID: \"c65a3188-9f0a-4525-b876-635822d8dda4\") " pod="openshift-marketplace/certified-operators-jwpk7" Mar 20 11:11:21 crc kubenswrapper[4860]: I0320 11:11:21.100132 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c65a3188-9f0a-4525-b876-635822d8dda4-utilities\") pod \"certified-operators-jwpk7\" (UID: \"c65a3188-9f0a-4525-b876-635822d8dda4\") " pod="openshift-marketplace/certified-operators-jwpk7" Mar 20 11:11:21 crc kubenswrapper[4860]: I0320 11:11:21.100173 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6jkn\" (UniqueName: \"kubernetes.io/projected/c65a3188-9f0a-4525-b876-635822d8dda4-kube-api-access-d6jkn\") pod \"certified-operators-jwpk7\" (UID: \"c65a3188-9f0a-4525-b876-635822d8dda4\") " pod="openshift-marketplace/certified-operators-jwpk7" Mar 20 11:11:21 crc kubenswrapper[4860]: I0320 11:11:21.201451 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/7da0294e-5ac5-4655-b882-cfd1f36ce791-util\") pod \"7da0294e-5ac5-4655-b882-cfd1f36ce791\" (UID: \"7da0294e-5ac5-4655-b882-cfd1f36ce791\") " Mar 20 11:11:21 crc kubenswrapper[4860]: I0320 11:11:21.201556 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7da0294e-5ac5-4655-b882-cfd1f36ce791-bundle\") pod \"7da0294e-5ac5-4655-b882-cfd1f36ce791\" (UID: \"7da0294e-5ac5-4655-b882-cfd1f36ce791\") " Mar 20 11:11:21 crc kubenswrapper[4860]: I0320 11:11:21.201600 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6frwf\" (UniqueName: \"kubernetes.io/projected/7da0294e-5ac5-4655-b882-cfd1f36ce791-kube-api-access-6frwf\") pod \"7da0294e-5ac5-4655-b882-cfd1f36ce791\" (UID: \"7da0294e-5ac5-4655-b882-cfd1f36ce791\") " Mar 20 11:11:21 crc kubenswrapper[4860]: I0320 11:11:21.201873 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c65a3188-9f0a-4525-b876-635822d8dda4-catalog-content\") pod \"certified-operators-jwpk7\" (UID: \"c65a3188-9f0a-4525-b876-635822d8dda4\") " pod="openshift-marketplace/certified-operators-jwpk7" Mar 20 11:11:21 crc kubenswrapper[4860]: I0320 11:11:21.201911 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c65a3188-9f0a-4525-b876-635822d8dda4-utilities\") pod \"certified-operators-jwpk7\" (UID: \"c65a3188-9f0a-4525-b876-635822d8dda4\") " pod="openshift-marketplace/certified-operators-jwpk7" Mar 20 11:11:21 crc kubenswrapper[4860]: I0320 11:11:21.201946 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6jkn\" (UniqueName: \"kubernetes.io/projected/c65a3188-9f0a-4525-b876-635822d8dda4-kube-api-access-d6jkn\") pod \"certified-operators-jwpk7\" (UID: 
\"c65a3188-9f0a-4525-b876-635822d8dda4\") " pod="openshift-marketplace/certified-operators-jwpk7" Mar 20 11:11:21 crc kubenswrapper[4860]: I0320 11:11:21.202593 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c65a3188-9f0a-4525-b876-635822d8dda4-utilities\") pod \"certified-operators-jwpk7\" (UID: \"c65a3188-9f0a-4525-b876-635822d8dda4\") " pod="openshift-marketplace/certified-operators-jwpk7" Mar 20 11:11:21 crc kubenswrapper[4860]: I0320 11:11:21.202777 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c65a3188-9f0a-4525-b876-635822d8dda4-catalog-content\") pod \"certified-operators-jwpk7\" (UID: \"c65a3188-9f0a-4525-b876-635822d8dda4\") " pod="openshift-marketplace/certified-operators-jwpk7" Mar 20 11:11:21 crc kubenswrapper[4860]: I0320 11:11:21.203199 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7da0294e-5ac5-4655-b882-cfd1f36ce791-bundle" (OuterVolumeSpecName: "bundle") pod "7da0294e-5ac5-4655-b882-cfd1f36ce791" (UID: "7da0294e-5ac5-4655-b882-cfd1f36ce791"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:11:21 crc kubenswrapper[4860]: I0320 11:11:21.217353 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7da0294e-5ac5-4655-b882-cfd1f36ce791-kube-api-access-6frwf" (OuterVolumeSpecName: "kube-api-access-6frwf") pod "7da0294e-5ac5-4655-b882-cfd1f36ce791" (UID: "7da0294e-5ac5-4655-b882-cfd1f36ce791"). InnerVolumeSpecName "kube-api-access-6frwf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:11:21 crc kubenswrapper[4860]: I0320 11:11:21.229142 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6jkn\" (UniqueName: \"kubernetes.io/projected/c65a3188-9f0a-4525-b876-635822d8dda4-kube-api-access-d6jkn\") pod \"certified-operators-jwpk7\" (UID: \"c65a3188-9f0a-4525-b876-635822d8dda4\") " pod="openshift-marketplace/certified-operators-jwpk7" Mar 20 11:11:21 crc kubenswrapper[4860]: I0320 11:11:21.304286 4860 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7da0294e-5ac5-4655-b882-cfd1f36ce791-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 11:11:21 crc kubenswrapper[4860]: I0320 11:11:21.304349 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6frwf\" (UniqueName: \"kubernetes.io/projected/7da0294e-5ac5-4655-b882-cfd1f36ce791-kube-api-access-6frwf\") on node \"crc\" DevicePath \"\"" Mar 20 11:11:21 crc kubenswrapper[4860]: I0320 11:11:21.348195 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7da0294e-5ac5-4655-b882-cfd1f36ce791-util" (OuterVolumeSpecName: "util") pod "7da0294e-5ac5-4655-b882-cfd1f36ce791" (UID: "7da0294e-5ac5-4655-b882-cfd1f36ce791"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:11:21 crc kubenswrapper[4860]: I0320 11:11:21.361096 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jwpk7" Mar 20 11:11:21 crc kubenswrapper[4860]: I0320 11:11:21.406066 4860 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7da0294e-5ac5-4655-b882-cfd1f36ce791-util\") on node \"crc\" DevicePath \"\"" Mar 20 11:11:21 crc kubenswrapper[4860]: I0320 11:11:21.626478 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jwpk7"] Mar 20 11:11:21 crc kubenswrapper[4860]: I0320 11:11:21.761336 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nskpj" event={"ID":"659f81bd-8f32-4dd2-9a63-6b2665fa6647","Type":"ContainerStarted","Data":"c6814f68bcca89f3e902966194b072801e04bd2ac54dbd226c91e59deb878610"} Mar 20 11:11:21 crc kubenswrapper[4860]: I0320 11:11:21.763810 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jwpk7" event={"ID":"c65a3188-9f0a-4525-b876-635822d8dda4","Type":"ContainerStarted","Data":"e6307789218fa3c6a03a6f158c301aa7e53133c1765dba4abc69e40da0ce5696"} Mar 20 11:11:21 crc kubenswrapper[4860]: I0320 11:11:21.766190 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sgngn" event={"ID":"7da0294e-5ac5-4655-b882-cfd1f36ce791","Type":"ContainerDied","Data":"6605f719a5d599cbc168598e36bac59f42a20935c2dd222c9a8f61b2a7ff744b"} Mar 20 11:11:21 crc kubenswrapper[4860]: I0320 11:11:21.766288 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6605f719a5d599cbc168598e36bac59f42a20935c2dd222c9a8f61b2a7ff744b" Mar 20 11:11:21 crc kubenswrapper[4860]: I0320 11:11:21.766294 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sgngn" Mar 20 11:11:22 crc kubenswrapper[4860]: I0320 11:11:22.344633 4860 patch_prober.go:28] interesting pod/machine-config-daemon-kvdqp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:11:22 crc kubenswrapper[4860]: I0320 11:11:22.344726 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:11:22 crc kubenswrapper[4860]: I0320 11:11:22.775946 4860 generic.go:334] "Generic (PLEG): container finished" podID="c65a3188-9f0a-4525-b876-635822d8dda4" containerID="e14537304077b317ad670f8e2115905292466a260253edc43ac45ce35e816ef6" exitCode=0 Mar 20 11:11:22 crc kubenswrapper[4860]: I0320 11:11:22.776045 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jwpk7" event={"ID":"c65a3188-9f0a-4525-b876-635822d8dda4","Type":"ContainerDied","Data":"e14537304077b317ad670f8e2115905292466a260253edc43ac45ce35e816ef6"} Mar 20 11:11:22 crc kubenswrapper[4860]: I0320 11:11:22.779141 4860 generic.go:334] "Generic (PLEG): container finished" podID="659f81bd-8f32-4dd2-9a63-6b2665fa6647" containerID="c6814f68bcca89f3e902966194b072801e04bd2ac54dbd226c91e59deb878610" exitCode=0 Mar 20 11:11:22 crc kubenswrapper[4860]: I0320 11:11:22.779200 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nskpj" 
event={"ID":"659f81bd-8f32-4dd2-9a63-6b2665fa6647","Type":"ContainerDied","Data":"c6814f68bcca89f3e902966194b072801e04bd2ac54dbd226c91e59deb878610"} Mar 20 11:11:23 crc kubenswrapper[4860]: I0320 11:11:23.787900 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jwpk7" event={"ID":"c65a3188-9f0a-4525-b876-635822d8dda4","Type":"ContainerStarted","Data":"e68be3bab36d85e5b37688fc943817973a421b95bb420a49b7aa9a1cf465b12b"} Mar 20 11:11:23 crc kubenswrapper[4860]: I0320 11:11:23.791024 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nskpj" event={"ID":"659f81bd-8f32-4dd2-9a63-6b2665fa6647","Type":"ContainerStarted","Data":"42bad4c11bf889acf02f0318595f240e854802661f3002d50e2d7621ab40266d"} Mar 20 11:11:24 crc kubenswrapper[4860]: I0320 11:11:24.800726 4860 generic.go:334] "Generic (PLEG): container finished" podID="c65a3188-9f0a-4525-b876-635822d8dda4" containerID="e68be3bab36d85e5b37688fc943817973a421b95bb420a49b7aa9a1cf465b12b" exitCode=0 Mar 20 11:11:24 crc kubenswrapper[4860]: I0320 11:11:24.800813 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jwpk7" event={"ID":"c65a3188-9f0a-4525-b876-635822d8dda4","Type":"ContainerDied","Data":"e68be3bab36d85e5b37688fc943817973a421b95bb420a49b7aa9a1cf465b12b"} Mar 20 11:11:24 crc kubenswrapper[4860]: I0320 11:11:24.825350 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-nskpj" podStartSLOduration=3.088717664 podStartE2EDuration="5.825324235s" podCreationTimestamp="2026-03-20 11:11:19 +0000 UTC" firstStartedPulling="2026-03-20 11:11:20.750537964 +0000 UTC m=+1004.971898862" lastFinishedPulling="2026-03-20 11:11:23.487144535 +0000 UTC m=+1007.708505433" observedRunningTime="2026-03-20 11:11:23.833484581 +0000 UTC m=+1008.054845479" watchObservedRunningTime="2026-03-20 11:11:24.825324235 +0000 UTC 
m=+1009.046685133" Mar 20 11:11:25 crc kubenswrapper[4860]: I0320 11:11:25.812409 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jwpk7" event={"ID":"c65a3188-9f0a-4525-b876-635822d8dda4","Type":"ContainerStarted","Data":"fcf22871cd5e74e6fe79c7e6a0618d042337435567daae70e294425c235cfefb"} Mar 20 11:11:25 crc kubenswrapper[4860]: I0320 11:11:25.848318 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jwpk7" podStartSLOduration=3.37822803 podStartE2EDuration="5.848290928s" podCreationTimestamp="2026-03-20 11:11:20 +0000 UTC" firstStartedPulling="2026-03-20 11:11:22.778154065 +0000 UTC m=+1006.999514963" lastFinishedPulling="2026-03-20 11:11:25.248216963 +0000 UTC m=+1009.469577861" observedRunningTime="2026-03-20 11:11:25.843705944 +0000 UTC m=+1010.065066852" watchObservedRunningTime="2026-03-20 11:11:25.848290928 +0000 UTC m=+1010.069651826" Mar 20 11:11:29 crc kubenswrapper[4860]: I0320 11:11:29.951893 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-nskpj" Mar 20 11:11:29 crc kubenswrapper[4860]: I0320 11:11:29.952862 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-nskpj" Mar 20 11:11:30 crc kubenswrapper[4860]: I0320 11:11:30.007984 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-nskpj" Mar 20 11:11:30 crc kubenswrapper[4860]: I0320 11:11:30.495759 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-5c589f6ccd-bcpmf"] Mar 20 11:11:30 crc kubenswrapper[4860]: E0320 11:11:30.496635 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7da0294e-5ac5-4655-b882-cfd1f36ce791" containerName="pull" Mar 20 11:11:30 crc kubenswrapper[4860]: I0320 11:11:30.496733 4860 
state_mem.go:107] "Deleted CPUSet assignment" podUID="7da0294e-5ac5-4655-b882-cfd1f36ce791" containerName="pull" Mar 20 11:11:30 crc kubenswrapper[4860]: E0320 11:11:30.496814 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7da0294e-5ac5-4655-b882-cfd1f36ce791" containerName="extract" Mar 20 11:11:30 crc kubenswrapper[4860]: I0320 11:11:30.496881 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="7da0294e-5ac5-4655-b882-cfd1f36ce791" containerName="extract" Mar 20 11:11:30 crc kubenswrapper[4860]: E0320 11:11:30.496955 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7da0294e-5ac5-4655-b882-cfd1f36ce791" containerName="util" Mar 20 11:11:30 crc kubenswrapper[4860]: I0320 11:11:30.497016 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="7da0294e-5ac5-4655-b882-cfd1f36ce791" containerName="util" Mar 20 11:11:30 crc kubenswrapper[4860]: I0320 11:11:30.497263 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="7da0294e-5ac5-4655-b882-cfd1f36ce791" containerName="extract" Mar 20 11:11:30 crc kubenswrapper[4860]: I0320 11:11:30.497927 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5c589f6ccd-bcpmf" Mar 20 11:11:30 crc kubenswrapper[4860]: I0320 11:11:30.501554 4860 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 20 11:11:30 crc kubenswrapper[4860]: I0320 11:11:30.502017 4860 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-lgrkw" Mar 20 11:11:30 crc kubenswrapper[4860]: I0320 11:11:30.502082 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 20 11:11:30 crc kubenswrapper[4860]: I0320 11:11:30.502082 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 20 11:11:30 crc kubenswrapper[4860]: I0320 11:11:30.503378 4860 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 20 11:11:30 crc kubenswrapper[4860]: I0320 11:11:30.531750 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5c589f6ccd-bcpmf"] Mar 20 11:11:30 crc kubenswrapper[4860]: I0320 11:11:30.560108 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bb8f951b-6aa9-420c-9bad-dfa857482d4c-apiservice-cert\") pod \"metallb-operator-controller-manager-5c589f6ccd-bcpmf\" (UID: \"bb8f951b-6aa9-420c-9bad-dfa857482d4c\") " pod="metallb-system/metallb-operator-controller-manager-5c589f6ccd-bcpmf" Mar 20 11:11:30 crc kubenswrapper[4860]: I0320 11:11:30.560268 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bb8f951b-6aa9-420c-9bad-dfa857482d4c-webhook-cert\") pod \"metallb-operator-controller-manager-5c589f6ccd-bcpmf\" (UID: 
\"bb8f951b-6aa9-420c-9bad-dfa857482d4c\") " pod="metallb-system/metallb-operator-controller-manager-5c589f6ccd-bcpmf" Mar 20 11:11:30 crc kubenswrapper[4860]: I0320 11:11:30.560302 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27srv\" (UniqueName: \"kubernetes.io/projected/bb8f951b-6aa9-420c-9bad-dfa857482d4c-kube-api-access-27srv\") pod \"metallb-operator-controller-manager-5c589f6ccd-bcpmf\" (UID: \"bb8f951b-6aa9-420c-9bad-dfa857482d4c\") " pod="metallb-system/metallb-operator-controller-manager-5c589f6ccd-bcpmf" Mar 20 11:11:30 crc kubenswrapper[4860]: I0320 11:11:30.661558 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bb8f951b-6aa9-420c-9bad-dfa857482d4c-apiservice-cert\") pod \"metallb-operator-controller-manager-5c589f6ccd-bcpmf\" (UID: \"bb8f951b-6aa9-420c-9bad-dfa857482d4c\") " pod="metallb-system/metallb-operator-controller-manager-5c589f6ccd-bcpmf" Mar 20 11:11:30 crc kubenswrapper[4860]: I0320 11:11:30.661695 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bb8f951b-6aa9-420c-9bad-dfa857482d4c-webhook-cert\") pod \"metallb-operator-controller-manager-5c589f6ccd-bcpmf\" (UID: \"bb8f951b-6aa9-420c-9bad-dfa857482d4c\") " pod="metallb-system/metallb-operator-controller-manager-5c589f6ccd-bcpmf" Mar 20 11:11:30 crc kubenswrapper[4860]: I0320 11:11:30.661723 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27srv\" (UniqueName: \"kubernetes.io/projected/bb8f951b-6aa9-420c-9bad-dfa857482d4c-kube-api-access-27srv\") pod \"metallb-operator-controller-manager-5c589f6ccd-bcpmf\" (UID: \"bb8f951b-6aa9-420c-9bad-dfa857482d4c\") " pod="metallb-system/metallb-operator-controller-manager-5c589f6ccd-bcpmf" Mar 20 11:11:30 crc kubenswrapper[4860]: I0320 11:11:30.670370 4860 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bb8f951b-6aa9-420c-9bad-dfa857482d4c-apiservice-cert\") pod \"metallb-operator-controller-manager-5c589f6ccd-bcpmf\" (UID: \"bb8f951b-6aa9-420c-9bad-dfa857482d4c\") " pod="metallb-system/metallb-operator-controller-manager-5c589f6ccd-bcpmf" Mar 20 11:11:30 crc kubenswrapper[4860]: I0320 11:11:30.674583 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bb8f951b-6aa9-420c-9bad-dfa857482d4c-webhook-cert\") pod \"metallb-operator-controller-manager-5c589f6ccd-bcpmf\" (UID: \"bb8f951b-6aa9-420c-9bad-dfa857482d4c\") " pod="metallb-system/metallb-operator-controller-manager-5c589f6ccd-bcpmf" Mar 20 11:11:30 crc kubenswrapper[4860]: I0320 11:11:30.682167 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27srv\" (UniqueName: \"kubernetes.io/projected/bb8f951b-6aa9-420c-9bad-dfa857482d4c-kube-api-access-27srv\") pod \"metallb-operator-controller-manager-5c589f6ccd-bcpmf\" (UID: \"bb8f951b-6aa9-420c-9bad-dfa857482d4c\") " pod="metallb-system/metallb-operator-controller-manager-5c589f6ccd-bcpmf" Mar 20 11:11:30 crc kubenswrapper[4860]: I0320 11:11:30.769482 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-587dc5fb9c-2t48s"] Mar 20 11:11:30 crc kubenswrapper[4860]: I0320 11:11:30.770881 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-587dc5fb9c-2t48s" Mar 20 11:11:30 crc kubenswrapper[4860]: I0320 11:11:30.773392 4860 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-hzlvn" Mar 20 11:11:30 crc kubenswrapper[4860]: I0320 11:11:30.773704 4860 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 20 11:11:30 crc kubenswrapper[4860]: I0320 11:11:30.774583 4860 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 20 11:11:30 crc kubenswrapper[4860]: I0320 11:11:30.793678 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-587dc5fb9c-2t48s"] Mar 20 11:11:30 crc kubenswrapper[4860]: I0320 11:11:30.817212 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5c589f6ccd-bcpmf" Mar 20 11:11:30 crc kubenswrapper[4860]: I0320 11:11:30.864532 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1eb0189c-2177-4c4e-83f6-7ba051322847-apiservice-cert\") pod \"metallb-operator-webhook-server-587dc5fb9c-2t48s\" (UID: \"1eb0189c-2177-4c4e-83f6-7ba051322847\") " pod="metallb-system/metallb-operator-webhook-server-587dc5fb9c-2t48s" Mar 20 11:11:30 crc kubenswrapper[4860]: I0320 11:11:30.864630 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfnxg\" (UniqueName: \"kubernetes.io/projected/1eb0189c-2177-4c4e-83f6-7ba051322847-kube-api-access-hfnxg\") pod \"metallb-operator-webhook-server-587dc5fb9c-2t48s\" (UID: \"1eb0189c-2177-4c4e-83f6-7ba051322847\") " pod="metallb-system/metallb-operator-webhook-server-587dc5fb9c-2t48s" Mar 20 11:11:30 crc kubenswrapper[4860]: 
I0320 11:11:30.864667 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1eb0189c-2177-4c4e-83f6-7ba051322847-webhook-cert\") pod \"metallb-operator-webhook-server-587dc5fb9c-2t48s\" (UID: \"1eb0189c-2177-4c4e-83f6-7ba051322847\") " pod="metallb-system/metallb-operator-webhook-server-587dc5fb9c-2t48s" Mar 20 11:11:30 crc kubenswrapper[4860]: I0320 11:11:30.937854 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-nskpj" Mar 20 11:11:30 crc kubenswrapper[4860]: I0320 11:11:30.966400 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1eb0189c-2177-4c4e-83f6-7ba051322847-apiservice-cert\") pod \"metallb-operator-webhook-server-587dc5fb9c-2t48s\" (UID: \"1eb0189c-2177-4c4e-83f6-7ba051322847\") " pod="metallb-system/metallb-operator-webhook-server-587dc5fb9c-2t48s" Mar 20 11:11:30 crc kubenswrapper[4860]: I0320 11:11:30.966515 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfnxg\" (UniqueName: \"kubernetes.io/projected/1eb0189c-2177-4c4e-83f6-7ba051322847-kube-api-access-hfnxg\") pod \"metallb-operator-webhook-server-587dc5fb9c-2t48s\" (UID: \"1eb0189c-2177-4c4e-83f6-7ba051322847\") " pod="metallb-system/metallb-operator-webhook-server-587dc5fb9c-2t48s" Mar 20 11:11:30 crc kubenswrapper[4860]: I0320 11:11:30.966573 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1eb0189c-2177-4c4e-83f6-7ba051322847-webhook-cert\") pod \"metallb-operator-webhook-server-587dc5fb9c-2t48s\" (UID: \"1eb0189c-2177-4c4e-83f6-7ba051322847\") " pod="metallb-system/metallb-operator-webhook-server-587dc5fb9c-2t48s" Mar 20 11:11:30 crc kubenswrapper[4860]: I0320 11:11:30.973607 4860 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1eb0189c-2177-4c4e-83f6-7ba051322847-webhook-cert\") pod \"metallb-operator-webhook-server-587dc5fb9c-2t48s\" (UID: \"1eb0189c-2177-4c4e-83f6-7ba051322847\") " pod="metallb-system/metallb-operator-webhook-server-587dc5fb9c-2t48s" Mar 20 11:11:30 crc kubenswrapper[4860]: I0320 11:11:30.973694 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1eb0189c-2177-4c4e-83f6-7ba051322847-apiservice-cert\") pod \"metallb-operator-webhook-server-587dc5fb9c-2t48s\" (UID: \"1eb0189c-2177-4c4e-83f6-7ba051322847\") " pod="metallb-system/metallb-operator-webhook-server-587dc5fb9c-2t48s" Mar 20 11:11:30 crc kubenswrapper[4860]: I0320 11:11:30.997656 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfnxg\" (UniqueName: \"kubernetes.io/projected/1eb0189c-2177-4c4e-83f6-7ba051322847-kube-api-access-hfnxg\") pod \"metallb-operator-webhook-server-587dc5fb9c-2t48s\" (UID: \"1eb0189c-2177-4c4e-83f6-7ba051322847\") " pod="metallb-system/metallb-operator-webhook-server-587dc5fb9c-2t48s" Mar 20 11:11:31 crc kubenswrapper[4860]: I0320 11:11:31.037728 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hmtwf"] Mar 20 11:11:31 crc kubenswrapper[4860]: I0320 11:11:31.039114 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hmtwf" Mar 20 11:11:31 crc kubenswrapper[4860]: I0320 11:11:31.054772 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hmtwf"] Mar 20 11:11:31 crc kubenswrapper[4860]: I0320 11:11:31.089454 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-587dc5fb9c-2t48s" Mar 20 11:11:31 crc kubenswrapper[4860]: I0320 11:11:31.175300 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/422d4ef5-4317-467e-b7cb-e20258f2865d-catalog-content\") pod \"community-operators-hmtwf\" (UID: \"422d4ef5-4317-467e-b7cb-e20258f2865d\") " pod="openshift-marketplace/community-operators-hmtwf" Mar 20 11:11:31 crc kubenswrapper[4860]: I0320 11:11:31.175874 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/422d4ef5-4317-467e-b7cb-e20258f2865d-utilities\") pod \"community-operators-hmtwf\" (UID: \"422d4ef5-4317-467e-b7cb-e20258f2865d\") " pod="openshift-marketplace/community-operators-hmtwf" Mar 20 11:11:31 crc kubenswrapper[4860]: I0320 11:11:31.175908 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9trl\" (UniqueName: \"kubernetes.io/projected/422d4ef5-4317-467e-b7cb-e20258f2865d-kube-api-access-q9trl\") pod \"community-operators-hmtwf\" (UID: \"422d4ef5-4317-467e-b7cb-e20258f2865d\") " pod="openshift-marketplace/community-operators-hmtwf" Mar 20 11:11:31 crc kubenswrapper[4860]: I0320 11:11:31.208043 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5c589f6ccd-bcpmf"] Mar 20 11:11:31 crc kubenswrapper[4860]: W0320 11:11:31.239750 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb8f951b_6aa9_420c_9bad_dfa857482d4c.slice/crio-1bc5c987d948a8cfbf2f21007a00b7f113bd7202f8a89641c803aee41d9e966d WatchSource:0}: Error finding container 1bc5c987d948a8cfbf2f21007a00b7f113bd7202f8a89641c803aee41d9e966d: Status 404 returned error can't find the 
container with id 1bc5c987d948a8cfbf2f21007a00b7f113bd7202f8a89641c803aee41d9e966d Mar 20 11:11:31 crc kubenswrapper[4860]: I0320 11:11:31.277312 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/422d4ef5-4317-467e-b7cb-e20258f2865d-utilities\") pod \"community-operators-hmtwf\" (UID: \"422d4ef5-4317-467e-b7cb-e20258f2865d\") " pod="openshift-marketplace/community-operators-hmtwf" Mar 20 11:11:31 crc kubenswrapper[4860]: I0320 11:11:31.277369 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9trl\" (UniqueName: \"kubernetes.io/projected/422d4ef5-4317-467e-b7cb-e20258f2865d-kube-api-access-q9trl\") pod \"community-operators-hmtwf\" (UID: \"422d4ef5-4317-467e-b7cb-e20258f2865d\") " pod="openshift-marketplace/community-operators-hmtwf" Mar 20 11:11:31 crc kubenswrapper[4860]: I0320 11:11:31.277418 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/422d4ef5-4317-467e-b7cb-e20258f2865d-catalog-content\") pod \"community-operators-hmtwf\" (UID: \"422d4ef5-4317-467e-b7cb-e20258f2865d\") " pod="openshift-marketplace/community-operators-hmtwf" Mar 20 11:11:31 crc kubenswrapper[4860]: I0320 11:11:31.278122 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/422d4ef5-4317-467e-b7cb-e20258f2865d-catalog-content\") pod \"community-operators-hmtwf\" (UID: \"422d4ef5-4317-467e-b7cb-e20258f2865d\") " pod="openshift-marketplace/community-operators-hmtwf" Mar 20 11:11:31 crc kubenswrapper[4860]: I0320 11:11:31.278417 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/422d4ef5-4317-467e-b7cb-e20258f2865d-utilities\") pod \"community-operators-hmtwf\" (UID: \"422d4ef5-4317-467e-b7cb-e20258f2865d\") " 
pod="openshift-marketplace/community-operators-hmtwf" Mar 20 11:11:31 crc kubenswrapper[4860]: I0320 11:11:31.325305 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9trl\" (UniqueName: \"kubernetes.io/projected/422d4ef5-4317-467e-b7cb-e20258f2865d-kube-api-access-q9trl\") pod \"community-operators-hmtwf\" (UID: \"422d4ef5-4317-467e-b7cb-e20258f2865d\") " pod="openshift-marketplace/community-operators-hmtwf" Mar 20 11:11:31 crc kubenswrapper[4860]: I0320 11:11:31.361810 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jwpk7" Mar 20 11:11:31 crc kubenswrapper[4860]: I0320 11:11:31.361868 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jwpk7" Mar 20 11:11:31 crc kubenswrapper[4860]: I0320 11:11:31.389930 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hmtwf" Mar 20 11:11:31 crc kubenswrapper[4860]: I0320 11:11:31.576835 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jwpk7" Mar 20 11:11:31 crc kubenswrapper[4860]: I0320 11:11:31.611712 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-587dc5fb9c-2t48s"] Mar 20 11:11:31 crc kubenswrapper[4860]: W0320 11:11:31.636816 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1eb0189c_2177_4c4e_83f6_7ba051322847.slice/crio-88cd98393c43887dd770ab7c2de91e79cfb9f8082dab97cb7fd8ce92e87bcf96 WatchSource:0}: Error finding container 88cd98393c43887dd770ab7c2de91e79cfb9f8082dab97cb7fd8ce92e87bcf96: Status 404 returned error can't find the container with id 88cd98393c43887dd770ab7c2de91e79cfb9f8082dab97cb7fd8ce92e87bcf96 Mar 20 11:11:31 crc kubenswrapper[4860]: I0320 
11:11:31.845837 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hmtwf"] Mar 20 11:11:31 crc kubenswrapper[4860]: I0320 11:11:31.881476 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5c589f6ccd-bcpmf" event={"ID":"bb8f951b-6aa9-420c-9bad-dfa857482d4c","Type":"ContainerStarted","Data":"1bc5c987d948a8cfbf2f21007a00b7f113bd7202f8a89641c803aee41d9e966d"} Mar 20 11:11:31 crc kubenswrapper[4860]: I0320 11:11:31.886832 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-587dc5fb9c-2t48s" event={"ID":"1eb0189c-2177-4c4e-83f6-7ba051322847","Type":"ContainerStarted","Data":"88cd98393c43887dd770ab7c2de91e79cfb9f8082dab97cb7fd8ce92e87bcf96"} Mar 20 11:11:31 crc kubenswrapper[4860]: I0320 11:11:31.973491 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jwpk7" Mar 20 11:11:32 crc kubenswrapper[4860]: I0320 11:11:32.897784 4860 generic.go:334] "Generic (PLEG): container finished" podID="422d4ef5-4317-467e-b7cb-e20258f2865d" containerID="e2ac6d203aeb92405284cfef9812c477008e52d2c2ddbae1ce0cb7bf131fb282" exitCode=0 Mar 20 11:11:32 crc kubenswrapper[4860]: I0320 11:11:32.898401 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hmtwf" event={"ID":"422d4ef5-4317-467e-b7cb-e20258f2865d","Type":"ContainerDied","Data":"e2ac6d203aeb92405284cfef9812c477008e52d2c2ddbae1ce0cb7bf131fb282"} Mar 20 11:11:32 crc kubenswrapper[4860]: I0320 11:11:32.898599 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hmtwf" event={"ID":"422d4ef5-4317-467e-b7cb-e20258f2865d","Type":"ContainerStarted","Data":"70c9e8ca00f897038aed84c357d0f86149e529ad64db01ad212da50e9a0df3da"} Mar 20 11:11:34 crc kubenswrapper[4860]: I0320 11:11:34.662456 4860 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nskpj"] Mar 20 11:11:34 crc kubenswrapper[4860]: I0320 11:11:34.663169 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-nskpj" podUID="659f81bd-8f32-4dd2-9a63-6b2665fa6647" containerName="registry-server" containerID="cri-o://42bad4c11bf889acf02f0318595f240e854802661f3002d50e2d7621ab40266d" gracePeriod=2 Mar 20 11:11:34 crc kubenswrapper[4860]: I0320 11:11:34.933183 4860 generic.go:334] "Generic (PLEG): container finished" podID="659f81bd-8f32-4dd2-9a63-6b2665fa6647" containerID="42bad4c11bf889acf02f0318595f240e854802661f3002d50e2d7621ab40266d" exitCode=0 Mar 20 11:11:34 crc kubenswrapper[4860]: I0320 11:11:34.933249 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nskpj" event={"ID":"659f81bd-8f32-4dd2-9a63-6b2665fa6647","Type":"ContainerDied","Data":"42bad4c11bf889acf02f0318595f240e854802661f3002d50e2d7621ab40266d"} Mar 20 11:11:35 crc kubenswrapper[4860]: I0320 11:11:35.818696 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nskpj" Mar 20 11:11:35 crc kubenswrapper[4860]: I0320 11:11:35.912405 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/659f81bd-8f32-4dd2-9a63-6b2665fa6647-utilities\") pod \"659f81bd-8f32-4dd2-9a63-6b2665fa6647\" (UID: \"659f81bd-8f32-4dd2-9a63-6b2665fa6647\") " Mar 20 11:11:35 crc kubenswrapper[4860]: I0320 11:11:35.912468 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/659f81bd-8f32-4dd2-9a63-6b2665fa6647-catalog-content\") pod \"659f81bd-8f32-4dd2-9a63-6b2665fa6647\" (UID: \"659f81bd-8f32-4dd2-9a63-6b2665fa6647\") " Mar 20 11:11:35 crc kubenswrapper[4860]: I0320 11:11:35.912488 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngdgx\" (UniqueName: \"kubernetes.io/projected/659f81bd-8f32-4dd2-9a63-6b2665fa6647-kube-api-access-ngdgx\") pod \"659f81bd-8f32-4dd2-9a63-6b2665fa6647\" (UID: \"659f81bd-8f32-4dd2-9a63-6b2665fa6647\") " Mar 20 11:11:35 crc kubenswrapper[4860]: I0320 11:11:35.915638 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/659f81bd-8f32-4dd2-9a63-6b2665fa6647-utilities" (OuterVolumeSpecName: "utilities") pod "659f81bd-8f32-4dd2-9a63-6b2665fa6647" (UID: "659f81bd-8f32-4dd2-9a63-6b2665fa6647"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:11:35 crc kubenswrapper[4860]: I0320 11:11:35.922093 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/659f81bd-8f32-4dd2-9a63-6b2665fa6647-kube-api-access-ngdgx" (OuterVolumeSpecName: "kube-api-access-ngdgx") pod "659f81bd-8f32-4dd2-9a63-6b2665fa6647" (UID: "659f81bd-8f32-4dd2-9a63-6b2665fa6647"). InnerVolumeSpecName "kube-api-access-ngdgx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:11:35 crc kubenswrapper[4860]: I0320 11:11:35.942798 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5c589f6ccd-bcpmf" event={"ID":"bb8f951b-6aa9-420c-9bad-dfa857482d4c","Type":"ContainerStarted","Data":"2807ec143e9f4b047895d4a3fdab36e21f03e518540ddd30bffa5f634ec12d7f"} Mar 20 11:11:35 crc kubenswrapper[4860]: I0320 11:11:35.943064 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-5c589f6ccd-bcpmf" Mar 20 11:11:35 crc kubenswrapper[4860]: I0320 11:11:35.949510 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/659f81bd-8f32-4dd2-9a63-6b2665fa6647-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "659f81bd-8f32-4dd2-9a63-6b2665fa6647" (UID: "659f81bd-8f32-4dd2-9a63-6b2665fa6647"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:11:35 crc kubenswrapper[4860]: I0320 11:11:35.954076 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hmtwf" event={"ID":"422d4ef5-4317-467e-b7cb-e20258f2865d","Type":"ContainerStarted","Data":"9323b3425e5b64abc66c4f89a77758eebaefc4a3e684f49cd55ba24c87a0c405"} Mar 20 11:11:35 crc kubenswrapper[4860]: I0320 11:11:35.958246 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nskpj" event={"ID":"659f81bd-8f32-4dd2-9a63-6b2665fa6647","Type":"ContainerDied","Data":"41dfd9902a89b0dfeabbbcc8c55d12c6c1350b3b1f92b4a588be3bf82690038f"} Mar 20 11:11:35 crc kubenswrapper[4860]: I0320 11:11:35.958351 4860 scope.go:117] "RemoveContainer" containerID="42bad4c11bf889acf02f0318595f240e854802661f3002d50e2d7621ab40266d" Mar 20 11:11:35 crc kubenswrapper[4860]: I0320 11:11:35.958289 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nskpj" Mar 20 11:11:35 crc kubenswrapper[4860]: I0320 11:11:35.975720 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-5c589f6ccd-bcpmf" podStartSLOduration=1.65905661 podStartE2EDuration="5.97569206s" podCreationTimestamp="2026-03-20 11:11:30 +0000 UTC" firstStartedPulling="2026-03-20 11:11:31.25613017 +0000 UTC m=+1015.477491058" lastFinishedPulling="2026-03-20 11:11:35.57276561 +0000 UTC m=+1019.794126508" observedRunningTime="2026-03-20 11:11:35.963341597 +0000 UTC m=+1020.184702485" watchObservedRunningTime="2026-03-20 11:11:35.97569206 +0000 UTC m=+1020.197052958" Mar 20 11:11:36 crc kubenswrapper[4860]: I0320 11:11:36.003841 4860 scope.go:117] "RemoveContainer" containerID="c6814f68bcca89f3e902966194b072801e04bd2ac54dbd226c91e59deb878610" Mar 20 11:11:36 crc kubenswrapper[4860]: I0320 11:11:36.014717 4860 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/659f81bd-8f32-4dd2-9a63-6b2665fa6647-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:11:36 crc kubenswrapper[4860]: I0320 11:11:36.014763 4860 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/659f81bd-8f32-4dd2-9a63-6b2665fa6647-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:11:36 crc kubenswrapper[4860]: I0320 11:11:36.014777 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngdgx\" (UniqueName: \"kubernetes.io/projected/659f81bd-8f32-4dd2-9a63-6b2665fa6647-kube-api-access-ngdgx\") on node \"crc\" DevicePath \"\"" Mar 20 11:11:36 crc kubenswrapper[4860]: I0320 11:11:36.020699 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nskpj"] Mar 20 11:11:36 crc kubenswrapper[4860]: I0320 11:11:36.035678 4860 scope.go:117] "RemoveContainer" 
containerID="a3af29e4cd182fe4ad8f9d7ac048c0de3c19c12bb591945ad5bc082f0dae2c7c" Mar 20 11:11:36 crc kubenswrapper[4860]: I0320 11:11:36.047681 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-nskpj"] Mar 20 11:11:36 crc kubenswrapper[4860]: I0320 11:11:36.971814 4860 generic.go:334] "Generic (PLEG): container finished" podID="422d4ef5-4317-467e-b7cb-e20258f2865d" containerID="9323b3425e5b64abc66c4f89a77758eebaefc4a3e684f49cd55ba24c87a0c405" exitCode=0 Mar 20 11:11:36 crc kubenswrapper[4860]: I0320 11:11:36.973301 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hmtwf" event={"ID":"422d4ef5-4317-467e-b7cb-e20258f2865d","Type":"ContainerDied","Data":"9323b3425e5b64abc66c4f89a77758eebaefc4a3e684f49cd55ba24c87a0c405"} Mar 20 11:11:37 crc kubenswrapper[4860]: I0320 11:11:37.423463 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="659f81bd-8f32-4dd2-9a63-6b2665fa6647" path="/var/lib/kubelet/pods/659f81bd-8f32-4dd2-9a63-6b2665fa6647/volumes" Mar 20 11:11:37 crc kubenswrapper[4860]: I0320 11:11:37.816535 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jwpk7"] Mar 20 11:11:37 crc kubenswrapper[4860]: I0320 11:11:37.816838 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jwpk7" podUID="c65a3188-9f0a-4525-b876-635822d8dda4" containerName="registry-server" containerID="cri-o://fcf22871cd5e74e6fe79c7e6a0618d042337435567daae70e294425c235cfefb" gracePeriod=2 Mar 20 11:11:38 crc kubenswrapper[4860]: I0320 11:11:38.003614 4860 generic.go:334] "Generic (PLEG): container finished" podID="c65a3188-9f0a-4525-b876-635822d8dda4" containerID="fcf22871cd5e74e6fe79c7e6a0618d042337435567daae70e294425c235cfefb" exitCode=0 Mar 20 11:11:38 crc kubenswrapper[4860]: I0320 11:11:38.003653 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-jwpk7" event={"ID":"c65a3188-9f0a-4525-b876-635822d8dda4","Type":"ContainerDied","Data":"fcf22871cd5e74e6fe79c7e6a0618d042337435567daae70e294425c235cfefb"} Mar 20 11:11:39 crc kubenswrapper[4860]: I0320 11:11:39.191504 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jwpk7" Mar 20 11:11:39 crc kubenswrapper[4860]: I0320 11:11:39.229395 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c65a3188-9f0a-4525-b876-635822d8dda4-catalog-content\") pod \"c65a3188-9f0a-4525-b876-635822d8dda4\" (UID: \"c65a3188-9f0a-4525-b876-635822d8dda4\") " Mar 20 11:11:39 crc kubenswrapper[4860]: I0320 11:11:39.229495 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c65a3188-9f0a-4525-b876-635822d8dda4-utilities\") pod \"c65a3188-9f0a-4525-b876-635822d8dda4\" (UID: \"c65a3188-9f0a-4525-b876-635822d8dda4\") " Mar 20 11:11:39 crc kubenswrapper[4860]: I0320 11:11:39.229521 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6jkn\" (UniqueName: \"kubernetes.io/projected/c65a3188-9f0a-4525-b876-635822d8dda4-kube-api-access-d6jkn\") pod \"c65a3188-9f0a-4525-b876-635822d8dda4\" (UID: \"c65a3188-9f0a-4525-b876-635822d8dda4\") " Mar 20 11:11:39 crc kubenswrapper[4860]: I0320 11:11:39.236712 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c65a3188-9f0a-4525-b876-635822d8dda4-utilities" (OuterVolumeSpecName: "utilities") pod "c65a3188-9f0a-4525-b876-635822d8dda4" (UID: "c65a3188-9f0a-4525-b876-635822d8dda4"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:11:39 crc kubenswrapper[4860]: I0320 11:11:39.258446 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c65a3188-9f0a-4525-b876-635822d8dda4-kube-api-access-d6jkn" (OuterVolumeSpecName: "kube-api-access-d6jkn") pod "c65a3188-9f0a-4525-b876-635822d8dda4" (UID: "c65a3188-9f0a-4525-b876-635822d8dda4"). InnerVolumeSpecName "kube-api-access-d6jkn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:11:39 crc kubenswrapper[4860]: I0320 11:11:39.318455 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c65a3188-9f0a-4525-b876-635822d8dda4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c65a3188-9f0a-4525-b876-635822d8dda4" (UID: "c65a3188-9f0a-4525-b876-635822d8dda4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:11:39 crc kubenswrapper[4860]: I0320 11:11:39.330923 4860 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c65a3188-9f0a-4525-b876-635822d8dda4-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:11:39 crc kubenswrapper[4860]: I0320 11:11:39.330964 4860 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c65a3188-9f0a-4525-b876-635822d8dda4-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:11:39 crc kubenswrapper[4860]: I0320 11:11:39.330978 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6jkn\" (UniqueName: \"kubernetes.io/projected/c65a3188-9f0a-4525-b876-635822d8dda4-kube-api-access-d6jkn\") on node \"crc\" DevicePath \"\"" Mar 20 11:11:40 crc kubenswrapper[4860]: I0320 11:11:40.018097 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hmtwf" 
event={"ID":"422d4ef5-4317-467e-b7cb-e20258f2865d","Type":"ContainerStarted","Data":"2e40b057d300f1a934ecff02427dd0d8255ffaa10e640a70d306832ce1fd7b46"} Mar 20 11:11:40 crc kubenswrapper[4860]: I0320 11:11:40.022022 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jwpk7" event={"ID":"c65a3188-9f0a-4525-b876-635822d8dda4","Type":"ContainerDied","Data":"e6307789218fa3c6a03a6f158c301aa7e53133c1765dba4abc69e40da0ce5696"} Mar 20 11:11:40 crc kubenswrapper[4860]: I0320 11:11:40.022080 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jwpk7" Mar 20 11:11:40 crc kubenswrapper[4860]: I0320 11:11:40.022137 4860 scope.go:117] "RemoveContainer" containerID="fcf22871cd5e74e6fe79c7e6a0618d042337435567daae70e294425c235cfefb" Mar 20 11:11:40 crc kubenswrapper[4860]: I0320 11:11:40.024147 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-587dc5fb9c-2t48s" event={"ID":"1eb0189c-2177-4c4e-83f6-7ba051322847","Type":"ContainerStarted","Data":"e891085d91de217023d537d53a0d32496654849cd5143ddadaa349181dfda470"} Mar 20 11:11:40 crc kubenswrapper[4860]: I0320 11:11:40.024905 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-587dc5fb9c-2t48s" Mar 20 11:11:40 crc kubenswrapper[4860]: I0320 11:11:40.043778 4860 scope.go:117] "RemoveContainer" containerID="e68be3bab36d85e5b37688fc943817973a421b95bb420a49b7aa9a1cf465b12b" Mar 20 11:11:40 crc kubenswrapper[4860]: I0320 11:11:40.057699 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hmtwf" podStartSLOduration=3.619979625 podStartE2EDuration="10.057678196s" podCreationTimestamp="2026-03-20 11:11:30 +0000 UTC" firstStartedPulling="2026-03-20 11:11:32.899431643 +0000 UTC m=+1017.120792531" lastFinishedPulling="2026-03-20 
11:11:39.337130204 +0000 UTC m=+1023.558491102" observedRunningTime="2026-03-20 11:11:40.052947278 +0000 UTC m=+1024.274308176" watchObservedRunningTime="2026-03-20 11:11:40.057678196 +0000 UTC m=+1024.279039094" Mar 20 11:11:40 crc kubenswrapper[4860]: I0320 11:11:40.065516 4860 scope.go:117] "RemoveContainer" containerID="e14537304077b317ad670f8e2115905292466a260253edc43ac45ce35e816ef6" Mar 20 11:11:40 crc kubenswrapper[4860]: I0320 11:11:40.071400 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jwpk7"] Mar 20 11:11:40 crc kubenswrapper[4860]: I0320 11:11:40.084875 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jwpk7"] Mar 20 11:11:40 crc kubenswrapper[4860]: I0320 11:11:40.109269 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-587dc5fb9c-2t48s" podStartSLOduration=2.817079343 podStartE2EDuration="10.109224065s" podCreationTimestamp="2026-03-20 11:11:30 +0000 UTC" firstStartedPulling="2026-03-20 11:11:31.645376881 +0000 UTC m=+1015.866737779" lastFinishedPulling="2026-03-20 11:11:38.937521603 +0000 UTC m=+1023.158882501" observedRunningTime="2026-03-20 11:11:40.106687027 +0000 UTC m=+1024.328047935" watchObservedRunningTime="2026-03-20 11:11:40.109224065 +0000 UTC m=+1024.330584963" Mar 20 11:11:41 crc kubenswrapper[4860]: I0320 11:11:41.391408 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hmtwf" Mar 20 11:11:41 crc kubenswrapper[4860]: I0320 11:11:41.391460 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hmtwf" Mar 20 11:11:41 crc kubenswrapper[4860]: I0320 11:11:41.422008 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c65a3188-9f0a-4525-b876-635822d8dda4" 
path="/var/lib/kubelet/pods/c65a3188-9f0a-4525-b876-635822d8dda4/volumes" Mar 20 11:11:42 crc kubenswrapper[4860]: I0320 11:11:42.435692 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-hmtwf" podUID="422d4ef5-4317-467e-b7cb-e20258f2865d" containerName="registry-server" probeResult="failure" output=< Mar 20 11:11:42 crc kubenswrapper[4860]: timeout: failed to connect service ":50051" within 1s Mar 20 11:11:42 crc kubenswrapper[4860]: > Mar 20 11:11:51 crc kubenswrapper[4860]: I0320 11:11:51.095866 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-587dc5fb9c-2t48s" Mar 20 11:11:51 crc kubenswrapper[4860]: I0320 11:11:51.434007 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hmtwf" Mar 20 11:11:51 crc kubenswrapper[4860]: I0320 11:11:51.483711 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hmtwf" Mar 20 11:11:52 crc kubenswrapper[4860]: I0320 11:11:52.344737 4860 patch_prober.go:28] interesting pod/machine-config-daemon-kvdqp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:11:52 crc kubenswrapper[4860]: I0320 11:11:52.344802 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:11:52 crc kubenswrapper[4860]: I0320 11:11:52.344855 4860 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" Mar 20 11:11:52 crc kubenswrapper[4860]: I0320 11:11:52.345653 4860 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8a778954d1374877671fc9bca86b7581cc1911487a943aba7bf61952dec5e818"} pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 11:11:52 crc kubenswrapper[4860]: I0320 11:11:52.345720 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" containerID="cri-o://8a778954d1374877671fc9bca86b7581cc1911487a943aba7bf61952dec5e818" gracePeriod=600 Mar 20 11:11:53 crc kubenswrapper[4860]: I0320 11:11:53.124620 4860 generic.go:334] "Generic (PLEG): container finished" podID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerID="8a778954d1374877671fc9bca86b7581cc1911487a943aba7bf61952dec5e818" exitCode=0 Mar 20 11:11:53 crc kubenswrapper[4860]: I0320 11:11:53.124726 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" event={"ID":"6a9df230-75a1-4b64-8d00-c179e9c19080","Type":"ContainerDied","Data":"8a778954d1374877671fc9bca86b7581cc1911487a943aba7bf61952dec5e818"} Mar 20 11:11:53 crc kubenswrapper[4860]: I0320 11:11:53.125266 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" event={"ID":"6a9df230-75a1-4b64-8d00-c179e9c19080","Type":"ContainerStarted","Data":"88d9c34d042546706542bef7e7c591b52fa1f874953861e5601a8c6aea607a26"} Mar 20 11:11:53 crc kubenswrapper[4860]: I0320 11:11:53.125306 4860 scope.go:117] "RemoveContainer" 
containerID="4b71d74fc211e299a940948264ea488965d39574d77d6b6b358fff8d3e35e4e3" Mar 20 11:11:54 crc kubenswrapper[4860]: I0320 11:11:54.805216 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hmtwf"] Mar 20 11:11:54 crc kubenswrapper[4860]: I0320 11:11:54.805998 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hmtwf" podUID="422d4ef5-4317-467e-b7cb-e20258f2865d" containerName="registry-server" containerID="cri-o://2e40b057d300f1a934ecff02427dd0d8255ffaa10e640a70d306832ce1fd7b46" gracePeriod=2 Mar 20 11:11:55 crc kubenswrapper[4860]: I0320 11:11:55.146773 4860 generic.go:334] "Generic (PLEG): container finished" podID="422d4ef5-4317-467e-b7cb-e20258f2865d" containerID="2e40b057d300f1a934ecff02427dd0d8255ffaa10e640a70d306832ce1fd7b46" exitCode=0 Mar 20 11:11:55 crc kubenswrapper[4860]: I0320 11:11:55.146838 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hmtwf" event={"ID":"422d4ef5-4317-467e-b7cb-e20258f2865d","Type":"ContainerDied","Data":"2e40b057d300f1a934ecff02427dd0d8255ffaa10e640a70d306832ce1fd7b46"} Mar 20 11:11:55 crc kubenswrapper[4860]: I0320 11:11:55.304033 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hmtwf" Mar 20 11:11:55 crc kubenswrapper[4860]: I0320 11:11:55.357129 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9trl\" (UniqueName: \"kubernetes.io/projected/422d4ef5-4317-467e-b7cb-e20258f2865d-kube-api-access-q9trl\") pod \"422d4ef5-4317-467e-b7cb-e20258f2865d\" (UID: \"422d4ef5-4317-467e-b7cb-e20258f2865d\") " Mar 20 11:11:55 crc kubenswrapper[4860]: I0320 11:11:55.357208 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/422d4ef5-4317-467e-b7cb-e20258f2865d-catalog-content\") pod \"422d4ef5-4317-467e-b7cb-e20258f2865d\" (UID: \"422d4ef5-4317-467e-b7cb-e20258f2865d\") " Mar 20 11:11:55 crc kubenswrapper[4860]: I0320 11:11:55.357248 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/422d4ef5-4317-467e-b7cb-e20258f2865d-utilities\") pod \"422d4ef5-4317-467e-b7cb-e20258f2865d\" (UID: \"422d4ef5-4317-467e-b7cb-e20258f2865d\") " Mar 20 11:11:55 crc kubenswrapper[4860]: I0320 11:11:55.358625 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/422d4ef5-4317-467e-b7cb-e20258f2865d-utilities" (OuterVolumeSpecName: "utilities") pod "422d4ef5-4317-467e-b7cb-e20258f2865d" (UID: "422d4ef5-4317-467e-b7cb-e20258f2865d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:11:55 crc kubenswrapper[4860]: I0320 11:11:55.364590 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/422d4ef5-4317-467e-b7cb-e20258f2865d-kube-api-access-q9trl" (OuterVolumeSpecName: "kube-api-access-q9trl") pod "422d4ef5-4317-467e-b7cb-e20258f2865d" (UID: "422d4ef5-4317-467e-b7cb-e20258f2865d"). InnerVolumeSpecName "kube-api-access-q9trl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:11:55 crc kubenswrapper[4860]: I0320 11:11:55.411583 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/422d4ef5-4317-467e-b7cb-e20258f2865d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "422d4ef5-4317-467e-b7cb-e20258f2865d" (UID: "422d4ef5-4317-467e-b7cb-e20258f2865d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:11:55 crc kubenswrapper[4860]: I0320 11:11:55.461083 4860 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/422d4ef5-4317-467e-b7cb-e20258f2865d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:11:55 crc kubenswrapper[4860]: I0320 11:11:55.461146 4860 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/422d4ef5-4317-467e-b7cb-e20258f2865d-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:11:55 crc kubenswrapper[4860]: I0320 11:11:55.461195 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q9trl\" (UniqueName: \"kubernetes.io/projected/422d4ef5-4317-467e-b7cb-e20258f2865d-kube-api-access-q9trl\") on node \"crc\" DevicePath \"\"" Mar 20 11:11:56 crc kubenswrapper[4860]: I0320 11:11:56.156966 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hmtwf" event={"ID":"422d4ef5-4317-467e-b7cb-e20258f2865d","Type":"ContainerDied","Data":"70c9e8ca00f897038aed84c357d0f86149e529ad64db01ad212da50e9a0df3da"} Mar 20 11:11:56 crc kubenswrapper[4860]: I0320 11:11:56.157045 4860 scope.go:117] "RemoveContainer" containerID="2e40b057d300f1a934ecff02427dd0d8255ffaa10e640a70d306832ce1fd7b46" Mar 20 11:11:56 crc kubenswrapper[4860]: I0320 11:11:56.157067 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hmtwf" Mar 20 11:11:56 crc kubenswrapper[4860]: I0320 11:11:56.177922 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hmtwf"] Mar 20 11:11:56 crc kubenswrapper[4860]: I0320 11:11:56.182393 4860 scope.go:117] "RemoveContainer" containerID="9323b3425e5b64abc66c4f89a77758eebaefc4a3e684f49cd55ba24c87a0c405" Mar 20 11:11:56 crc kubenswrapper[4860]: I0320 11:11:56.186556 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hmtwf"] Mar 20 11:11:56 crc kubenswrapper[4860]: I0320 11:11:56.207003 4860 scope.go:117] "RemoveContainer" containerID="e2ac6d203aeb92405284cfef9812c477008e52d2c2ddbae1ce0cb7bf131fb282" Mar 20 11:11:57 crc kubenswrapper[4860]: I0320 11:11:57.421340 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="422d4ef5-4317-467e-b7cb-e20258f2865d" path="/var/lib/kubelet/pods/422d4ef5-4317-467e-b7cb-e20258f2865d/volumes" Mar 20 11:12:00 crc kubenswrapper[4860]: I0320 11:12:00.136950 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566752-kn2qv"] Mar 20 11:12:00 crc kubenswrapper[4860]: E0320 11:12:00.138802 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c65a3188-9f0a-4525-b876-635822d8dda4" containerName="extract-utilities" Mar 20 11:12:00 crc kubenswrapper[4860]: I0320 11:12:00.138882 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="c65a3188-9f0a-4525-b876-635822d8dda4" containerName="extract-utilities" Mar 20 11:12:00 crc kubenswrapper[4860]: E0320 11:12:00.138944 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c65a3188-9f0a-4525-b876-635822d8dda4" containerName="extract-content" Mar 20 11:12:00 crc kubenswrapper[4860]: I0320 11:12:00.139002 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="c65a3188-9f0a-4525-b876-635822d8dda4" 
containerName="extract-content" Mar 20 11:12:00 crc kubenswrapper[4860]: E0320 11:12:00.139058 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="422d4ef5-4317-467e-b7cb-e20258f2865d" containerName="extract-content" Mar 20 11:12:00 crc kubenswrapper[4860]: I0320 11:12:00.139109 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="422d4ef5-4317-467e-b7cb-e20258f2865d" containerName="extract-content" Mar 20 11:12:00 crc kubenswrapper[4860]: E0320 11:12:00.139169 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c65a3188-9f0a-4525-b876-635822d8dda4" containerName="registry-server" Mar 20 11:12:00 crc kubenswrapper[4860]: I0320 11:12:00.139218 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="c65a3188-9f0a-4525-b876-635822d8dda4" containerName="registry-server" Mar 20 11:12:00 crc kubenswrapper[4860]: E0320 11:12:00.139313 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="422d4ef5-4317-467e-b7cb-e20258f2865d" containerName="registry-server" Mar 20 11:12:00 crc kubenswrapper[4860]: I0320 11:12:00.139381 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="422d4ef5-4317-467e-b7cb-e20258f2865d" containerName="registry-server" Mar 20 11:12:00 crc kubenswrapper[4860]: E0320 11:12:00.139457 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="659f81bd-8f32-4dd2-9a63-6b2665fa6647" containerName="extract-utilities" Mar 20 11:12:00 crc kubenswrapper[4860]: I0320 11:12:00.139517 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="659f81bd-8f32-4dd2-9a63-6b2665fa6647" containerName="extract-utilities" Mar 20 11:12:00 crc kubenswrapper[4860]: E0320 11:12:00.139572 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="659f81bd-8f32-4dd2-9a63-6b2665fa6647" containerName="extract-content" Mar 20 11:12:00 crc kubenswrapper[4860]: I0320 11:12:00.139626 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="659f81bd-8f32-4dd2-9a63-6b2665fa6647" 
containerName="extract-content" Mar 20 11:12:00 crc kubenswrapper[4860]: E0320 11:12:00.139683 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="659f81bd-8f32-4dd2-9a63-6b2665fa6647" containerName="registry-server" Mar 20 11:12:00 crc kubenswrapper[4860]: I0320 11:12:00.139741 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="659f81bd-8f32-4dd2-9a63-6b2665fa6647" containerName="registry-server" Mar 20 11:12:00 crc kubenswrapper[4860]: E0320 11:12:00.139829 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="422d4ef5-4317-467e-b7cb-e20258f2865d" containerName="extract-utilities" Mar 20 11:12:00 crc kubenswrapper[4860]: I0320 11:12:00.139884 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="422d4ef5-4317-467e-b7cb-e20258f2865d" containerName="extract-utilities" Mar 20 11:12:00 crc kubenswrapper[4860]: I0320 11:12:00.140042 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="659f81bd-8f32-4dd2-9a63-6b2665fa6647" containerName="registry-server" Mar 20 11:12:00 crc kubenswrapper[4860]: I0320 11:12:00.140107 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="c65a3188-9f0a-4525-b876-635822d8dda4" containerName="registry-server" Mar 20 11:12:00 crc kubenswrapper[4860]: I0320 11:12:00.140166 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="422d4ef5-4317-467e-b7cb-e20258f2865d" containerName="registry-server" Mar 20 11:12:00 crc kubenswrapper[4860]: I0320 11:12:00.140727 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566752-kn2qv" Mar 20 11:12:00 crc kubenswrapper[4860]: I0320 11:12:00.146121 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-82p2f" Mar 20 11:12:00 crc kubenswrapper[4860]: I0320 11:12:00.149918 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:12:00 crc kubenswrapper[4860]: I0320 11:12:00.151209 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:12:00 crc kubenswrapper[4860]: I0320 11:12:00.152105 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566752-kn2qv"] Mar 20 11:12:00 crc kubenswrapper[4860]: I0320 11:12:00.322545 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhg5p\" (UniqueName: \"kubernetes.io/projected/fdc939b6-92ac-4e00-ae32-b518e4257043-kube-api-access-vhg5p\") pod \"auto-csr-approver-29566752-kn2qv\" (UID: \"fdc939b6-92ac-4e00-ae32-b518e4257043\") " pod="openshift-infra/auto-csr-approver-29566752-kn2qv" Mar 20 11:12:00 crc kubenswrapper[4860]: I0320 11:12:00.424369 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhg5p\" (UniqueName: \"kubernetes.io/projected/fdc939b6-92ac-4e00-ae32-b518e4257043-kube-api-access-vhg5p\") pod \"auto-csr-approver-29566752-kn2qv\" (UID: \"fdc939b6-92ac-4e00-ae32-b518e4257043\") " pod="openshift-infra/auto-csr-approver-29566752-kn2qv" Mar 20 11:12:00 crc kubenswrapper[4860]: I0320 11:12:00.447910 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhg5p\" (UniqueName: \"kubernetes.io/projected/fdc939b6-92ac-4e00-ae32-b518e4257043-kube-api-access-vhg5p\") pod \"auto-csr-approver-29566752-kn2qv\" (UID: \"fdc939b6-92ac-4e00-ae32-b518e4257043\") " 
pod="openshift-infra/auto-csr-approver-29566752-kn2qv" Mar 20 11:12:00 crc kubenswrapper[4860]: I0320 11:12:00.460877 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566752-kn2qv" Mar 20 11:12:00 crc kubenswrapper[4860]: I0320 11:12:00.895960 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566752-kn2qv"] Mar 20 11:12:01 crc kubenswrapper[4860]: I0320 11:12:01.191126 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566752-kn2qv" event={"ID":"fdc939b6-92ac-4e00-ae32-b518e4257043","Type":"ContainerStarted","Data":"6dd301a3f869b503ba08b5f916f05897dc806b4490317b136fd4c55c00db64b9"} Mar 20 11:12:02 crc kubenswrapper[4860]: I0320 11:12:02.199915 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566752-kn2qv" event={"ID":"fdc939b6-92ac-4e00-ae32-b518e4257043","Type":"ContainerStarted","Data":"986b40f9c6d66be5183f5bf7b868f2a3962c56f81df4ee2138cb170b1b825e18"} Mar 20 11:12:02 crc kubenswrapper[4860]: I0320 11:12:02.220820 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566752-kn2qv" podStartSLOduration=1.418901302 podStartE2EDuration="2.220797147s" podCreationTimestamp="2026-03-20 11:12:00 +0000 UTC" firstStartedPulling="2026-03-20 11:12:00.910742916 +0000 UTC m=+1045.132103814" lastFinishedPulling="2026-03-20 11:12:01.712638761 +0000 UTC m=+1045.933999659" observedRunningTime="2026-03-20 11:12:02.217149078 +0000 UTC m=+1046.438509986" watchObservedRunningTime="2026-03-20 11:12:02.220797147 +0000 UTC m=+1046.442158045" Mar 20 11:12:03 crc kubenswrapper[4860]: I0320 11:12:03.216127 4860 generic.go:334] "Generic (PLEG): container finished" podID="fdc939b6-92ac-4e00-ae32-b518e4257043" containerID="986b40f9c6d66be5183f5bf7b868f2a3962c56f81df4ee2138cb170b1b825e18" exitCode=0 Mar 20 11:12:03 crc 
kubenswrapper[4860]: I0320 11:12:03.216249 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566752-kn2qv" event={"ID":"fdc939b6-92ac-4e00-ae32-b518e4257043","Type":"ContainerDied","Data":"986b40f9c6d66be5183f5bf7b868f2a3962c56f81df4ee2138cb170b1b825e18"} Mar 20 11:12:04 crc kubenswrapper[4860]: I0320 11:12:04.478541 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566752-kn2qv" Mar 20 11:12:04 crc kubenswrapper[4860]: I0320 11:12:04.481387 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhg5p\" (UniqueName: \"kubernetes.io/projected/fdc939b6-92ac-4e00-ae32-b518e4257043-kube-api-access-vhg5p\") pod \"fdc939b6-92ac-4e00-ae32-b518e4257043\" (UID: \"fdc939b6-92ac-4e00-ae32-b518e4257043\") " Mar 20 11:12:04 crc kubenswrapper[4860]: I0320 11:12:04.490492 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdc939b6-92ac-4e00-ae32-b518e4257043-kube-api-access-vhg5p" (OuterVolumeSpecName: "kube-api-access-vhg5p") pod "fdc939b6-92ac-4e00-ae32-b518e4257043" (UID: "fdc939b6-92ac-4e00-ae32-b518e4257043"). InnerVolumeSpecName "kube-api-access-vhg5p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:12:04 crc kubenswrapper[4860]: I0320 11:12:04.582984 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhg5p\" (UniqueName: \"kubernetes.io/projected/fdc939b6-92ac-4e00-ae32-b518e4257043-kube-api-access-vhg5p\") on node \"crc\" DevicePath \"\"" Mar 20 11:12:05 crc kubenswrapper[4860]: I0320 11:12:05.232412 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566752-kn2qv" event={"ID":"fdc939b6-92ac-4e00-ae32-b518e4257043","Type":"ContainerDied","Data":"6dd301a3f869b503ba08b5f916f05897dc806b4490317b136fd4c55c00db64b9"} Mar 20 11:12:05 crc kubenswrapper[4860]: I0320 11:12:05.232470 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6dd301a3f869b503ba08b5f916f05897dc806b4490317b136fd4c55c00db64b9" Mar 20 11:12:05 crc kubenswrapper[4860]: I0320 11:12:05.232535 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566752-kn2qv" Mar 20 11:12:05 crc kubenswrapper[4860]: I0320 11:12:05.274730 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566746-xphvf"] Mar 20 11:12:05 crc kubenswrapper[4860]: I0320 11:12:05.278158 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566746-xphvf"] Mar 20 11:12:05 crc kubenswrapper[4860]: I0320 11:12:05.420387 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="200c6cd9-8753-4805-9a49-50d3e429ea33" path="/var/lib/kubelet/pods/200c6cd9-8753-4805-9a49-50d3e429ea33/volumes" Mar 20 11:12:10 crc kubenswrapper[4860]: I0320 11:12:10.820670 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-5c589f6ccd-bcpmf" Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.472748 4860 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["metallb-system/frr-k8s-mzzpz"] Mar 20 11:12:11 crc kubenswrapper[4860]: E0320 11:12:11.473531 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdc939b6-92ac-4e00-ae32-b518e4257043" containerName="oc" Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.473549 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdc939b6-92ac-4e00-ae32-b518e4257043" containerName="oc" Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.473689 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdc939b6-92ac-4e00-ae32-b518e4257043" containerName="oc" Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.476084 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-mzzpz" Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.478441 4860 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.478608 4860 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-wjgxm" Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.483856 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.488916 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-jhncx"] Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.489773 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jhncx" Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.493368 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/dcd69c7f-fded-4b09-bd44-607b27716196-reloader\") pod \"frr-k8s-mzzpz\" (UID: \"dcd69c7f-fded-4b09-bd44-607b27716196\") " pod="metallb-system/frr-k8s-mzzpz" Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.493437 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/dcd69c7f-fded-4b09-bd44-607b27716196-metrics\") pod \"frr-k8s-mzzpz\" (UID: \"dcd69c7f-fded-4b09-bd44-607b27716196\") " pod="metallb-system/frr-k8s-mzzpz" Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.493485 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/dcd69c7f-fded-4b09-bd44-607b27716196-frr-startup\") pod \"frr-k8s-mzzpz\" (UID: \"dcd69c7f-fded-4b09-bd44-607b27716196\") " pod="metallb-system/frr-k8s-mzzpz" Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.493515 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzqwf\" (UniqueName: \"kubernetes.io/projected/dcd69c7f-fded-4b09-bd44-607b27716196-kube-api-access-wzqwf\") pod \"frr-k8s-mzzpz\" (UID: \"dcd69c7f-fded-4b09-bd44-607b27716196\") " pod="metallb-system/frr-k8s-mzzpz" Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.493572 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dcd69c7f-fded-4b09-bd44-607b27716196-metrics-certs\") pod \"frr-k8s-mzzpz\" (UID: \"dcd69c7f-fded-4b09-bd44-607b27716196\") " pod="metallb-system/frr-k8s-mzzpz" Mar 20 11:12:11 crc 
kubenswrapper[4860]: I0320 11:12:11.493602 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/dcd69c7f-fded-4b09-bd44-607b27716196-frr-conf\") pod \"frr-k8s-mzzpz\" (UID: \"dcd69c7f-fded-4b09-bd44-607b27716196\") " pod="metallb-system/frr-k8s-mzzpz" Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.493662 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/dcd69c7f-fded-4b09-bd44-607b27716196-frr-sockets\") pod \"frr-k8s-mzzpz\" (UID: \"dcd69c7f-fded-4b09-bd44-607b27716196\") " pod="metallb-system/frr-k8s-mzzpz" Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.495789 4860 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.509785 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-jhncx"] Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.581039 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-brjk7"] Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.582079 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-brjk7" Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.584017 4860 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-kn7gq" Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.584331 4860 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.584496 4860 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.585508 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.592291 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-tdcbt"] Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.593566 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-tdcbt" Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.594640 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3c1b54d0-fcfb-451b-ae3d-b731d3f9f6de-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-jhncx\" (UID: \"3c1b54d0-fcfb-451b-ae3d-b731d3f9f6de\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jhncx" Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.594700 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcfb9\" (UniqueName: \"kubernetes.io/projected/6ee4e9c2-66c1-4431-bde4-29d09a044a32-kube-api-access-vcfb9\") pod \"speaker-brjk7\" (UID: \"6ee4e9c2-66c1-4431-bde4-29d09a044a32\") " pod="metallb-system/speaker-brjk7" Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.594751 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/6ee4e9c2-66c1-4431-bde4-29d09a044a32-metallb-excludel2\") pod \"speaker-brjk7\" (UID: \"6ee4e9c2-66c1-4431-bde4-29d09a044a32\") " pod="metallb-system/speaker-brjk7" Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.594802 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dcd69c7f-fded-4b09-bd44-607b27716196-metrics-certs\") pod \"frr-k8s-mzzpz\" (UID: \"dcd69c7f-fded-4b09-bd44-607b27716196\") " pod="metallb-system/frr-k8s-mzzpz" Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.594832 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/dcd69c7f-fded-4b09-bd44-607b27716196-frr-conf\") pod \"frr-k8s-mzzpz\" (UID: \"dcd69c7f-fded-4b09-bd44-607b27716196\") " pod="metallb-system/frr-k8s-mzzpz" Mar 20 11:12:11 
crc kubenswrapper[4860]: I0320 11:12:11.594872 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/dcd69c7f-fded-4b09-bd44-607b27716196-frr-sockets\") pod \"frr-k8s-mzzpz\" (UID: \"dcd69c7f-fded-4b09-bd44-607b27716196\") " pod="metallb-system/frr-k8s-mzzpz" Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.594897 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/dcd69c7f-fded-4b09-bd44-607b27716196-reloader\") pod \"frr-k8s-mzzpz\" (UID: \"dcd69c7f-fded-4b09-bd44-607b27716196\") " pod="metallb-system/frr-k8s-mzzpz" Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.594929 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/dcd69c7f-fded-4b09-bd44-607b27716196-metrics\") pod \"frr-k8s-mzzpz\" (UID: \"dcd69c7f-fded-4b09-bd44-607b27716196\") " pod="metallb-system/frr-k8s-mzzpz" Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.594951 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7q7r\" (UniqueName: \"kubernetes.io/projected/3c1b54d0-fcfb-451b-ae3d-b731d3f9f6de-kube-api-access-n7q7r\") pod \"frr-k8s-webhook-server-bcc4b6f68-jhncx\" (UID: \"3c1b54d0-fcfb-451b-ae3d-b731d3f9f6de\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jhncx" Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.594978 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ee4e9c2-66c1-4431-bde4-29d09a044a32-metrics-certs\") pod \"speaker-brjk7\" (UID: \"6ee4e9c2-66c1-4431-bde4-29d09a044a32\") " pod="metallb-system/speaker-brjk7" Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.595005 4860 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6ee4e9c2-66c1-4431-bde4-29d09a044a32-memberlist\") pod \"speaker-brjk7\" (UID: \"6ee4e9c2-66c1-4431-bde4-29d09a044a32\") " pod="metallb-system/speaker-brjk7" Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.595083 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/dcd69c7f-fded-4b09-bd44-607b27716196-frr-startup\") pod \"frr-k8s-mzzpz\" (UID: \"dcd69c7f-fded-4b09-bd44-607b27716196\") " pod="metallb-system/frr-k8s-mzzpz" Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.595110 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzqwf\" (UniqueName: \"kubernetes.io/projected/dcd69c7f-fded-4b09-bd44-607b27716196-kube-api-access-wzqwf\") pod \"frr-k8s-mzzpz\" (UID: \"dcd69c7f-fded-4b09-bd44-607b27716196\") " pod="metallb-system/frr-k8s-mzzpz" Mar 20 11:12:11 crc kubenswrapper[4860]: E0320 11:12:11.595739 4860 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Mar 20 11:12:11 crc kubenswrapper[4860]: E0320 11:12:11.595824 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dcd69c7f-fded-4b09-bd44-607b27716196-metrics-certs podName:dcd69c7f-fded-4b09-bd44-607b27716196 nodeName:}" failed. No retries permitted until 2026-03-20 11:12:12.09580293 +0000 UTC m=+1056.317163828 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dcd69c7f-fded-4b09-bd44-607b27716196-metrics-certs") pod "frr-k8s-mzzpz" (UID: "dcd69c7f-fded-4b09-bd44-607b27716196") : secret "frr-k8s-certs-secret" not found Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.595901 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/dcd69c7f-fded-4b09-bd44-607b27716196-frr-sockets\") pod \"frr-k8s-mzzpz\" (UID: \"dcd69c7f-fded-4b09-bd44-607b27716196\") " pod="metallb-system/frr-k8s-mzzpz" Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.596037 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/dcd69c7f-fded-4b09-bd44-607b27716196-reloader\") pod \"frr-k8s-mzzpz\" (UID: \"dcd69c7f-fded-4b09-bd44-607b27716196\") " pod="metallb-system/frr-k8s-mzzpz" Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.596130 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/dcd69c7f-fded-4b09-bd44-607b27716196-frr-conf\") pod \"frr-k8s-mzzpz\" (UID: \"dcd69c7f-fded-4b09-bd44-607b27716196\") " pod="metallb-system/frr-k8s-mzzpz" Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.596191 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/dcd69c7f-fded-4b09-bd44-607b27716196-metrics\") pod \"frr-k8s-mzzpz\" (UID: \"dcd69c7f-fded-4b09-bd44-607b27716196\") " pod="metallb-system/frr-k8s-mzzpz" Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.596489 4860 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.597286 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: 
\"kubernetes.io/configmap/dcd69c7f-fded-4b09-bd44-607b27716196-frr-startup\") pod \"frr-k8s-mzzpz\" (UID: \"dcd69c7f-fded-4b09-bd44-607b27716196\") " pod="metallb-system/frr-k8s-mzzpz" Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.606952 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-tdcbt"] Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.645140 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzqwf\" (UniqueName: \"kubernetes.io/projected/dcd69c7f-fded-4b09-bd44-607b27716196-kube-api-access-wzqwf\") pod \"frr-k8s-mzzpz\" (UID: \"dcd69c7f-fded-4b09-bd44-607b27716196\") " pod="metallb-system/frr-k8s-mzzpz" Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.696503 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5k8pc\" (UniqueName: \"kubernetes.io/projected/a2a3b82e-416b-4757-8719-97c58493428e-kube-api-access-5k8pc\") pod \"controller-7bb4cc7c98-tdcbt\" (UID: \"a2a3b82e-416b-4757-8719-97c58493428e\") " pod="metallb-system/controller-7bb4cc7c98-tdcbt" Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.696622 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a2a3b82e-416b-4757-8719-97c58493428e-metrics-certs\") pod \"controller-7bb4cc7c98-tdcbt\" (UID: \"a2a3b82e-416b-4757-8719-97c58493428e\") " pod="metallb-system/controller-7bb4cc7c98-tdcbt" Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.696674 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7q7r\" (UniqueName: \"kubernetes.io/projected/3c1b54d0-fcfb-451b-ae3d-b731d3f9f6de-kube-api-access-n7q7r\") pod \"frr-k8s-webhook-server-bcc4b6f68-jhncx\" (UID: \"3c1b54d0-fcfb-451b-ae3d-b731d3f9f6de\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jhncx" Mar 20 11:12:11 
crc kubenswrapper[4860]: I0320 11:12:11.696700 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ee4e9c2-66c1-4431-bde4-29d09a044a32-metrics-certs\") pod \"speaker-brjk7\" (UID: \"6ee4e9c2-66c1-4431-bde4-29d09a044a32\") " pod="metallb-system/speaker-brjk7" Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.696724 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6ee4e9c2-66c1-4431-bde4-29d09a044a32-memberlist\") pod \"speaker-brjk7\" (UID: \"6ee4e9c2-66c1-4431-bde4-29d09a044a32\") " pod="metallb-system/speaker-brjk7" Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.696773 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcfb9\" (UniqueName: \"kubernetes.io/projected/6ee4e9c2-66c1-4431-bde4-29d09a044a32-kube-api-access-vcfb9\") pod \"speaker-brjk7\" (UID: \"6ee4e9c2-66c1-4431-bde4-29d09a044a32\") " pod="metallb-system/speaker-brjk7" Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.696800 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3c1b54d0-fcfb-451b-ae3d-b731d3f9f6de-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-jhncx\" (UID: \"3c1b54d0-fcfb-451b-ae3d-b731d3f9f6de\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jhncx" Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.696833 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/6ee4e9c2-66c1-4431-bde4-29d09a044a32-metallb-excludel2\") pod \"speaker-brjk7\" (UID: \"6ee4e9c2-66c1-4431-bde4-29d09a044a32\") " pod="metallb-system/speaker-brjk7" Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.696864 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"cert\" (UniqueName: \"kubernetes.io/secret/a2a3b82e-416b-4757-8719-97c58493428e-cert\") pod \"controller-7bb4cc7c98-tdcbt\" (UID: \"a2a3b82e-416b-4757-8719-97c58493428e\") " pod="metallb-system/controller-7bb4cc7c98-tdcbt" Mar 20 11:12:11 crc kubenswrapper[4860]: E0320 11:12:11.696872 4860 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Mar 20 11:12:11 crc kubenswrapper[4860]: E0320 11:12:11.696959 4860 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 20 11:12:11 crc kubenswrapper[4860]: E0320 11:12:11.697005 4860 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Mar 20 11:12:11 crc kubenswrapper[4860]: E0320 11:12:11.696961 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ee4e9c2-66c1-4431-bde4-29d09a044a32-metrics-certs podName:6ee4e9c2-66c1-4431-bde4-29d09a044a32 nodeName:}" failed. No retries permitted until 2026-03-20 11:12:12.196936846 +0000 UTC m=+1056.418297744 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6ee4e9c2-66c1-4431-bde4-29d09a044a32-metrics-certs") pod "speaker-brjk7" (UID: "6ee4e9c2-66c1-4431-bde4-29d09a044a32") : secret "speaker-certs-secret" not found Mar 20 11:12:11 crc kubenswrapper[4860]: E0320 11:12:11.697176 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ee4e9c2-66c1-4431-bde4-29d09a044a32-memberlist podName:6ee4e9c2-66c1-4431-bde4-29d09a044a32 nodeName:}" failed. No retries permitted until 2026-03-20 11:12:12.197140441 +0000 UTC m=+1056.418501469 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/6ee4e9c2-66c1-4431-bde4-29d09a044a32-memberlist") pod "speaker-brjk7" (UID: "6ee4e9c2-66c1-4431-bde4-29d09a044a32") : secret "metallb-memberlist" not found Mar 20 11:12:11 crc kubenswrapper[4860]: E0320 11:12:11.697205 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c1b54d0-fcfb-451b-ae3d-b731d3f9f6de-cert podName:3c1b54d0-fcfb-451b-ae3d-b731d3f9f6de nodeName:}" failed. No retries permitted until 2026-03-20 11:12:12.197195653 +0000 UTC m=+1056.418556781 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3c1b54d0-fcfb-451b-ae3d-b731d3f9f6de-cert") pod "frr-k8s-webhook-server-bcc4b6f68-jhncx" (UID: "3c1b54d0-fcfb-451b-ae3d-b731d3f9f6de") : secret "frr-k8s-webhook-server-cert" not found Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.698368 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/6ee4e9c2-66c1-4431-bde4-29d09a044a32-metallb-excludel2\") pod \"speaker-brjk7\" (UID: \"6ee4e9c2-66c1-4431-bde4-29d09a044a32\") " pod="metallb-system/speaker-brjk7" Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.720141 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcfb9\" (UniqueName: \"kubernetes.io/projected/6ee4e9c2-66c1-4431-bde4-29d09a044a32-kube-api-access-vcfb9\") pod \"speaker-brjk7\" (UID: \"6ee4e9c2-66c1-4431-bde4-29d09a044a32\") " pod="metallb-system/speaker-brjk7" Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.725880 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7q7r\" (UniqueName: \"kubernetes.io/projected/3c1b54d0-fcfb-451b-ae3d-b731d3f9f6de-kube-api-access-n7q7r\") pod \"frr-k8s-webhook-server-bcc4b6f68-jhncx\" (UID: \"3c1b54d0-fcfb-451b-ae3d-b731d3f9f6de\") " 
pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jhncx" Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.798477 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a2a3b82e-416b-4757-8719-97c58493428e-cert\") pod \"controller-7bb4cc7c98-tdcbt\" (UID: \"a2a3b82e-416b-4757-8719-97c58493428e\") " pod="metallb-system/controller-7bb4cc7c98-tdcbt" Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.799016 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5k8pc\" (UniqueName: \"kubernetes.io/projected/a2a3b82e-416b-4757-8719-97c58493428e-kube-api-access-5k8pc\") pod \"controller-7bb4cc7c98-tdcbt\" (UID: \"a2a3b82e-416b-4757-8719-97c58493428e\") " pod="metallb-system/controller-7bb4cc7c98-tdcbt" Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.799156 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a2a3b82e-416b-4757-8719-97c58493428e-metrics-certs\") pod \"controller-7bb4cc7c98-tdcbt\" (UID: \"a2a3b82e-416b-4757-8719-97c58493428e\") " pod="metallb-system/controller-7bb4cc7c98-tdcbt" Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.800358 4860 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.803528 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a2a3b82e-416b-4757-8719-97c58493428e-metrics-certs\") pod \"controller-7bb4cc7c98-tdcbt\" (UID: \"a2a3b82e-416b-4757-8719-97c58493428e\") " pod="metallb-system/controller-7bb4cc7c98-tdcbt" Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.816562 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a2a3b82e-416b-4757-8719-97c58493428e-cert\") 
pod \"controller-7bb4cc7c98-tdcbt\" (UID: \"a2a3b82e-416b-4757-8719-97c58493428e\") " pod="metallb-system/controller-7bb4cc7c98-tdcbt" Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.820000 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5k8pc\" (UniqueName: \"kubernetes.io/projected/a2a3b82e-416b-4757-8719-97c58493428e-kube-api-access-5k8pc\") pod \"controller-7bb4cc7c98-tdcbt\" (UID: \"a2a3b82e-416b-4757-8719-97c58493428e\") " pod="metallb-system/controller-7bb4cc7c98-tdcbt" Mar 20 11:12:11 crc kubenswrapper[4860]: I0320 11:12:11.911829 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-tdcbt" Mar 20 11:12:12 crc kubenswrapper[4860]: I0320 11:12:12.106130 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dcd69c7f-fded-4b09-bd44-607b27716196-metrics-certs\") pod \"frr-k8s-mzzpz\" (UID: \"dcd69c7f-fded-4b09-bd44-607b27716196\") " pod="metallb-system/frr-k8s-mzzpz" Mar 20 11:12:12 crc kubenswrapper[4860]: I0320 11:12:12.118998 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dcd69c7f-fded-4b09-bd44-607b27716196-metrics-certs\") pod \"frr-k8s-mzzpz\" (UID: \"dcd69c7f-fded-4b09-bd44-607b27716196\") " pod="metallb-system/frr-k8s-mzzpz" Mar 20 11:12:12 crc kubenswrapper[4860]: I0320 11:12:12.183372 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-tdcbt"] Mar 20 11:12:12 crc kubenswrapper[4860]: I0320 11:12:12.207040 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6ee4e9c2-66c1-4431-bde4-29d09a044a32-memberlist\") pod \"speaker-brjk7\" (UID: \"6ee4e9c2-66c1-4431-bde4-29d09a044a32\") " pod="metallb-system/speaker-brjk7" Mar 20 11:12:12 crc kubenswrapper[4860]: I0320 
11:12:12.207138 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3c1b54d0-fcfb-451b-ae3d-b731d3f9f6de-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-jhncx\" (UID: \"3c1b54d0-fcfb-451b-ae3d-b731d3f9f6de\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jhncx"
Mar 20 11:12:12 crc kubenswrapper[4860]: I0320 11:12:12.207210 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ee4e9c2-66c1-4431-bde4-29d09a044a32-metrics-certs\") pod \"speaker-brjk7\" (UID: \"6ee4e9c2-66c1-4431-bde4-29d09a044a32\") " pod="metallb-system/speaker-brjk7"
Mar 20 11:12:12 crc kubenswrapper[4860]: E0320 11:12:12.208328 4860 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Mar 20 11:12:12 crc kubenswrapper[4860]: E0320 11:12:12.208430 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ee4e9c2-66c1-4431-bde4-29d09a044a32-memberlist podName:6ee4e9c2-66c1-4431-bde4-29d09a044a32 nodeName:}" failed. No retries permitted until 2026-03-20 11:12:13.208406772 +0000 UTC m=+1057.429767840 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/6ee4e9c2-66c1-4431-bde4-29d09a044a32-memberlist") pod "speaker-brjk7" (UID: "6ee4e9c2-66c1-4431-bde4-29d09a044a32") : secret "metallb-memberlist" not found
Mar 20 11:12:12 crc kubenswrapper[4860]: I0320 11:12:12.218040 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ee4e9c2-66c1-4431-bde4-29d09a044a32-metrics-certs\") pod \"speaker-brjk7\" (UID: \"6ee4e9c2-66c1-4431-bde4-29d09a044a32\") " pod="metallb-system/speaker-brjk7"
Mar 20 11:12:12 crc kubenswrapper[4860]: I0320 11:12:12.219126 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3c1b54d0-fcfb-451b-ae3d-b731d3f9f6de-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-jhncx\" (UID: \"3c1b54d0-fcfb-451b-ae3d-b731d3f9f6de\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jhncx"
Mar 20 11:12:12 crc kubenswrapper[4860]: I0320 11:12:12.277490 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-tdcbt" event={"ID":"a2a3b82e-416b-4757-8719-97c58493428e","Type":"ContainerStarted","Data":"77826e25b7df8d41ed875fa4900194c9eb97fab2d3708c6b9b4ab8ffb23d5194"}
Mar 20 11:12:12 crc kubenswrapper[4860]: I0320 11:12:12.396190 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-mzzpz"
Mar 20 11:12:12 crc kubenswrapper[4860]: I0320 11:12:12.417026 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jhncx"
Mar 20 11:12:12 crc kubenswrapper[4860]: I0320 11:12:12.664168 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-jhncx"]
Mar 20 11:12:12 crc kubenswrapper[4860]: W0320 11:12:12.676512 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c1b54d0_fcfb_451b_ae3d_b731d3f9f6de.slice/crio-1d54cd7a8a80089e09ea62871356442a1db2f62947d67ce9bdcbbacb402b4458 WatchSource:0}: Error finding container 1d54cd7a8a80089e09ea62871356442a1db2f62947d67ce9bdcbbacb402b4458: Status 404 returned error can't find the container with id 1d54cd7a8a80089e09ea62871356442a1db2f62947d67ce9bdcbbacb402b4458
Mar 20 11:12:13 crc kubenswrapper[4860]: I0320 11:12:13.231214 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6ee4e9c2-66c1-4431-bde4-29d09a044a32-memberlist\") pod \"speaker-brjk7\" (UID: \"6ee4e9c2-66c1-4431-bde4-29d09a044a32\") " pod="metallb-system/speaker-brjk7"
Mar 20 11:12:13 crc kubenswrapper[4860]: I0320 11:12:13.239048 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6ee4e9c2-66c1-4431-bde4-29d09a044a32-memberlist\") pod \"speaker-brjk7\" (UID: \"6ee4e9c2-66c1-4431-bde4-29d09a044a32\") " pod="metallb-system/speaker-brjk7"
Mar 20 11:12:13 crc kubenswrapper[4860]: I0320 11:12:13.285650 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mzzpz" event={"ID":"dcd69c7f-fded-4b09-bd44-607b27716196","Type":"ContainerStarted","Data":"9a5a2a81788b3204aab5ede788399eaab2ede7ab52823dbaa8a0fe1d9a7e8f70"}
Mar 20 11:12:13 crc kubenswrapper[4860]: I0320 11:12:13.286723 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jhncx" event={"ID":"3c1b54d0-fcfb-451b-ae3d-b731d3f9f6de","Type":"ContainerStarted","Data":"1d54cd7a8a80089e09ea62871356442a1db2f62947d67ce9bdcbbacb402b4458"}
Mar 20 11:12:13 crc kubenswrapper[4860]: I0320 11:12:13.289456 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-tdcbt" event={"ID":"a2a3b82e-416b-4757-8719-97c58493428e","Type":"ContainerStarted","Data":"37874dc77ffe970aa159f172c1f5236af3dd2e530af7ff74f729ce2d30ceae32"}
Mar 20 11:12:13 crc kubenswrapper[4860]: I0320 11:12:13.289519 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-tdcbt" event={"ID":"a2a3b82e-416b-4757-8719-97c58493428e","Type":"ContainerStarted","Data":"9069102d7168139a81f69619e9ce7652a29e25d97c2adad87a3381640d2a24e3"}
Mar 20 11:12:13 crc kubenswrapper[4860]: I0320 11:12:13.289862 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-tdcbt"
Mar 20 11:12:13 crc kubenswrapper[4860]: I0320 11:12:13.313826 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-tdcbt" podStartSLOduration=2.313802686 podStartE2EDuration="2.313802686s" podCreationTimestamp="2026-03-20 11:12:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 11:12:13.307827405 +0000 UTC m=+1057.529188303" watchObservedRunningTime="2026-03-20 11:12:13.313802686 +0000 UTC m=+1057.535163584"
Mar 20 11:12:13 crc kubenswrapper[4860]: I0320 11:12:13.403459 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-brjk7"
Mar 20 11:12:13 crc kubenswrapper[4860]: W0320 11:12:13.432424 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ee4e9c2_66c1_4431_bde4_29d09a044a32.slice/crio-11ee8642eb2d5dfb2a282c972c8f9b64e10cb5afad30ac7cb08a84d1a8d79288 WatchSource:0}: Error finding container 11ee8642eb2d5dfb2a282c972c8f9b64e10cb5afad30ac7cb08a84d1a8d79288: Status 404 returned error can't find the container with id 11ee8642eb2d5dfb2a282c972c8f9b64e10cb5afad30ac7cb08a84d1a8d79288
Mar 20 11:12:14 crc kubenswrapper[4860]: I0320 11:12:14.307760 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-brjk7" event={"ID":"6ee4e9c2-66c1-4431-bde4-29d09a044a32","Type":"ContainerStarted","Data":"dba30ebe5d612295a23d92d89cb16de60bbd5d2659c7cf3b10afd3a35d49915e"}
Mar 20 11:12:14 crc kubenswrapper[4860]: I0320 11:12:14.308395 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-brjk7" event={"ID":"6ee4e9c2-66c1-4431-bde4-29d09a044a32","Type":"ContainerStarted","Data":"90cd72b4fe99d365a38733a8e261e3e8ea8f26af36dbf447b600e30f50309ec3"}
Mar 20 11:12:14 crc kubenswrapper[4860]: I0320 11:12:14.308436 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-brjk7" event={"ID":"6ee4e9c2-66c1-4431-bde4-29d09a044a32","Type":"ContainerStarted","Data":"11ee8642eb2d5dfb2a282c972c8f9b64e10cb5afad30ac7cb08a84d1a8d79288"}
Mar 20 11:12:14 crc kubenswrapper[4860]: I0320 11:12:14.308671 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-brjk7"
Mar 20 11:12:14 crc kubenswrapper[4860]: I0320 11:12:14.335792 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-brjk7" podStartSLOduration=3.335759072 podStartE2EDuration="3.335759072s" podCreationTimestamp="2026-03-20 11:12:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 11:12:14.32973444 +0000 UTC m=+1058.551095338" watchObservedRunningTime="2026-03-20 11:12:14.335759072 +0000 UTC m=+1058.557119970"
Mar 20 11:12:23 crc kubenswrapper[4860]: I0320 11:12:23.434197 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-brjk7"
Mar 20 11:12:24 crc kubenswrapper[4860]: I0320 11:12:24.629363 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jhncx" event={"ID":"3c1b54d0-fcfb-451b-ae3d-b731d3f9f6de","Type":"ContainerStarted","Data":"eed5c36b42d9cf47b25f131ff575301fdec9e430a45f5f8e32ec4a722ec37b75"}
Mar 20 11:12:24 crc kubenswrapper[4860]: I0320 11:12:24.630830 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jhncx"
Mar 20 11:12:24 crc kubenswrapper[4860]: I0320 11:12:24.632880 4860 generic.go:334] "Generic (PLEG): container finished" podID="dcd69c7f-fded-4b09-bd44-607b27716196" containerID="4dfb9e8ea6a38a44d2ab27fd0047c19010d14d1dae6cef2b3ad3a660a1b7d2c8" exitCode=0
Mar 20 11:12:24 crc kubenswrapper[4860]: I0320 11:12:24.632910 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mzzpz" event={"ID":"dcd69c7f-fded-4b09-bd44-607b27716196","Type":"ContainerDied","Data":"4dfb9e8ea6a38a44d2ab27fd0047c19010d14d1dae6cef2b3ad3a660a1b7d2c8"}
Mar 20 11:12:24 crc kubenswrapper[4860]: I0320 11:12:24.654989 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jhncx" podStartSLOduration=3.028546661 podStartE2EDuration="13.654962904s" podCreationTimestamp="2026-03-20 11:12:11 +0000 UTC" firstStartedPulling="2026-03-20 11:12:12.679175801 +0000 UTC m=+1056.900536699" lastFinishedPulling="2026-03-20 11:12:23.305592044 +0000 UTC m=+1067.526952942" observedRunningTime="2026-03-20 11:12:24.651403628 +0000 UTC m=+1068.872764526" watchObservedRunningTime="2026-03-20 11:12:24.654962904 +0000 UTC m=+1068.876323832"
Mar 20 11:12:24 crc kubenswrapper[4860]: I0320 11:12:24.858481 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5km9m2"]
Mar 20 11:12:24 crc kubenswrapper[4860]: I0320 11:12:24.859984 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5km9m2"
Mar 20 11:12:24 crc kubenswrapper[4860]: I0320 11:12:24.863885 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Mar 20 11:12:24 crc kubenswrapper[4860]: I0320 11:12:24.870336 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5km9m2"]
Mar 20 11:12:24 crc kubenswrapper[4860]: I0320 11:12:24.980586 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ea4498dd-681d-4260-b895-06e53dbcc9b9-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5km9m2\" (UID: \"ea4498dd-681d-4260-b895-06e53dbcc9b9\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5km9m2"
Mar 20 11:12:24 crc kubenswrapper[4860]: I0320 11:12:24.980667 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ea4498dd-681d-4260-b895-06e53dbcc9b9-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5km9m2\" (UID: \"ea4498dd-681d-4260-b895-06e53dbcc9b9\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5km9m2"
Mar 20 11:12:24 crc kubenswrapper[4860]: I0320 11:12:24.980720 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfjkl\" (UniqueName: \"kubernetes.io/projected/ea4498dd-681d-4260-b895-06e53dbcc9b9-kube-api-access-rfjkl\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5km9m2\" (UID: \"ea4498dd-681d-4260-b895-06e53dbcc9b9\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5km9m2"
Mar 20 11:12:25 crc kubenswrapper[4860]: I0320 11:12:25.082486 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ea4498dd-681d-4260-b895-06e53dbcc9b9-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5km9m2\" (UID: \"ea4498dd-681d-4260-b895-06e53dbcc9b9\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5km9m2"
Mar 20 11:12:25 crc kubenswrapper[4860]: I0320 11:12:25.082570 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfjkl\" (UniqueName: \"kubernetes.io/projected/ea4498dd-681d-4260-b895-06e53dbcc9b9-kube-api-access-rfjkl\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5km9m2\" (UID: \"ea4498dd-681d-4260-b895-06e53dbcc9b9\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5km9m2"
Mar 20 11:12:25 crc kubenswrapper[4860]: I0320 11:12:25.082676 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ea4498dd-681d-4260-b895-06e53dbcc9b9-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5km9m2\" (UID: \"ea4498dd-681d-4260-b895-06e53dbcc9b9\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5km9m2"
Mar 20 11:12:25 crc kubenswrapper[4860]: I0320 11:12:25.083139 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ea4498dd-681d-4260-b895-06e53dbcc9b9-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5km9m2\" (UID: \"ea4498dd-681d-4260-b895-06e53dbcc9b9\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5km9m2"
Mar 20 11:12:25 crc kubenswrapper[4860]: I0320 11:12:25.083179 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ea4498dd-681d-4260-b895-06e53dbcc9b9-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5km9m2\" (UID: \"ea4498dd-681d-4260-b895-06e53dbcc9b9\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5km9m2"
Mar 20 11:12:25 crc kubenswrapper[4860]: I0320 11:12:25.104139 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfjkl\" (UniqueName: \"kubernetes.io/projected/ea4498dd-681d-4260-b895-06e53dbcc9b9-kube-api-access-rfjkl\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5km9m2\" (UID: \"ea4498dd-681d-4260-b895-06e53dbcc9b9\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5km9m2"
Mar 20 11:12:25 crc kubenswrapper[4860]: I0320 11:12:25.176068 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5km9m2"
Mar 20 11:12:25 crc kubenswrapper[4860]: I0320 11:12:25.425534 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5km9m2"]
Mar 20 11:12:25 crc kubenswrapper[4860]: W0320 11:12:25.425712 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea4498dd_681d_4260_b895_06e53dbcc9b9.slice/crio-5cccc6d4f32f03d93feac9521857e367dfd46c76a0c04d5f4c410c0df53a068f WatchSource:0}: Error finding container 5cccc6d4f32f03d93feac9521857e367dfd46c76a0c04d5f4c410c0df53a068f: Status 404 returned error can't find the container with id 5cccc6d4f32f03d93feac9521857e367dfd46c76a0c04d5f4c410c0df53a068f
Mar 20 11:12:25 crc kubenswrapper[4860]: I0320 11:12:25.643394 4860 generic.go:334] "Generic (PLEG): container finished" podID="dcd69c7f-fded-4b09-bd44-607b27716196" containerID="aaefc127b3b814fda46a4fac6c09a1fa82786e8924d06b62a458ba7f97d8564f" exitCode=0
Mar 20 11:12:25 crc kubenswrapper[4860]: I0320 11:12:25.643671 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mzzpz" event={"ID":"dcd69c7f-fded-4b09-bd44-607b27716196","Type":"ContainerDied","Data":"aaefc127b3b814fda46a4fac6c09a1fa82786e8924d06b62a458ba7f97d8564f"}
Mar 20 11:12:25 crc kubenswrapper[4860]: I0320 11:12:25.655338 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5km9m2" event={"ID":"ea4498dd-681d-4260-b895-06e53dbcc9b9","Type":"ContainerStarted","Data":"9681f51161892af9e576f0f4da65bd4a9a88256f64c337c54af28c08b7ea3d52"}
Mar 20 11:12:25 crc kubenswrapper[4860]: I0320 11:12:25.655780 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5km9m2" event={"ID":"ea4498dd-681d-4260-b895-06e53dbcc9b9","Type":"ContainerStarted","Data":"5cccc6d4f32f03d93feac9521857e367dfd46c76a0c04d5f4c410c0df53a068f"}
Mar 20 11:12:26 crc kubenswrapper[4860]: I0320 11:12:26.664380 4860 generic.go:334] "Generic (PLEG): container finished" podID="dcd69c7f-fded-4b09-bd44-607b27716196" containerID="910adb48feabecb49b3a0d84e41546776436b3ec06da2344fee430676505c2cc" exitCode=0
Mar 20 11:12:26 crc kubenswrapper[4860]: I0320 11:12:26.664498 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mzzpz" event={"ID":"dcd69c7f-fded-4b09-bd44-607b27716196","Type":"ContainerDied","Data":"910adb48feabecb49b3a0d84e41546776436b3ec06da2344fee430676505c2cc"}
Mar 20 11:12:26 crc kubenswrapper[4860]: I0320 11:12:26.666246 4860 generic.go:334] "Generic (PLEG): container finished" podID="ea4498dd-681d-4260-b895-06e53dbcc9b9" containerID="9681f51161892af9e576f0f4da65bd4a9a88256f64c337c54af28c08b7ea3d52" exitCode=0
Mar 20 11:12:26 crc kubenswrapper[4860]: I0320 11:12:26.666358 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5km9m2" event={"ID":"ea4498dd-681d-4260-b895-06e53dbcc9b9","Type":"ContainerDied","Data":"9681f51161892af9e576f0f4da65bd4a9a88256f64c337c54af28c08b7ea3d52"}
Mar 20 11:12:27 crc kubenswrapper[4860]: I0320 11:12:27.678590 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mzzpz" event={"ID":"dcd69c7f-fded-4b09-bd44-607b27716196","Type":"ContainerStarted","Data":"6a17b8ad777c84daa1bb9c2527a79acc31ee9625f9a1ca1c55538d2aed2c2f03"}
Mar 20 11:12:27 crc kubenswrapper[4860]: I0320 11:12:27.679567 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mzzpz" event={"ID":"dcd69c7f-fded-4b09-bd44-607b27716196","Type":"ContainerStarted","Data":"283879f78be2c3c05ef7653b32c23325593b55e264572056cea75e01c82175cd"}
Mar 20 11:12:27 crc kubenswrapper[4860]: I0320 11:12:27.679584 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mzzpz" event={"ID":"dcd69c7f-fded-4b09-bd44-607b27716196","Type":"ContainerStarted","Data":"d953d2007587ca380c18421211fad6db96e5a8196beab17433802aa2491647a6"}
Mar 20 11:12:27 crc kubenswrapper[4860]: I0320 11:12:27.679597 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mzzpz" event={"ID":"dcd69c7f-fded-4b09-bd44-607b27716196","Type":"ContainerStarted","Data":"e31ef6f22e13c0c51b83b1537fdd286af21e3e29728a205a071567c260c7496a"}
Mar 20 11:12:27 crc kubenswrapper[4860]: I0320 11:12:27.679614 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mzzpz" event={"ID":"dcd69c7f-fded-4b09-bd44-607b27716196","Type":"ContainerStarted","Data":"5f2bfe9d7a2cd3d2a5943cb7ea54c59e59831333e9b9dab3ffc97cd2b6e20e93"}
Mar 20 11:12:28 crc kubenswrapper[4860]: I0320 11:12:28.692664 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mzzpz" event={"ID":"dcd69c7f-fded-4b09-bd44-607b27716196","Type":"ContainerStarted","Data":"e9dbd54f243509b30ec00213e9b5984ca62a3745f44b34c82fc394f17dbcae44"}
Mar 20 11:12:28 crc kubenswrapper[4860]: I0320 11:12:28.692910 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-mzzpz"
Mar 20 11:12:28 crc kubenswrapper[4860]: I0320 11:12:28.717868 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-mzzpz" podStartSLOduration=6.884554816 podStartE2EDuration="17.717844665s" podCreationTimestamp="2026-03-20 11:12:11 +0000 UTC" firstStartedPulling="2026-03-20 11:12:12.490946678 +0000 UTC m=+1056.712307576" lastFinishedPulling="2026-03-20 11:12:23.324236527 +0000 UTC m=+1067.545597425" observedRunningTime="2026-03-20 11:12:28.716668063 +0000 UTC m=+1072.938028961" watchObservedRunningTime="2026-03-20 11:12:28.717844665 +0000 UTC m=+1072.939205563"
Mar 20 11:12:31 crc kubenswrapper[4860]: I0320 11:12:31.918329 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-tdcbt"
Mar 20 11:12:32 crc kubenswrapper[4860]: I0320 11:12:32.397555 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-mzzpz"
Mar 20 11:12:32 crc kubenswrapper[4860]: I0320 11:12:32.440646 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-mzzpz"
Mar 20 11:12:32 crc kubenswrapper[4860]: I0320 11:12:32.753669 4860 generic.go:334] "Generic (PLEG): container finished" podID="ea4498dd-681d-4260-b895-06e53dbcc9b9" containerID="2716a4ea9ee047cc78eaeeb30ebecda79f9eb56973079b8f9647e018e103efe1" exitCode=0
Mar 20 11:12:32 crc kubenswrapper[4860]: I0320 11:12:32.753746 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5km9m2" event={"ID":"ea4498dd-681d-4260-b895-06e53dbcc9b9","Type":"ContainerDied","Data":"2716a4ea9ee047cc78eaeeb30ebecda79f9eb56973079b8f9647e018e103efe1"}
Mar 20 11:12:33 crc kubenswrapper[4860]: I0320 11:12:33.765820 4860 generic.go:334] "Generic (PLEG): container finished" podID="ea4498dd-681d-4260-b895-06e53dbcc9b9" containerID="db333eaee23207b52536e059c0067f4d72cc446172e50bde67fa1437447b5635" exitCode=0
Mar 20 11:12:33 crc kubenswrapper[4860]: I0320 11:12:33.765902 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5km9m2" event={"ID":"ea4498dd-681d-4260-b895-06e53dbcc9b9","Type":"ContainerDied","Data":"db333eaee23207b52536e059c0067f4d72cc446172e50bde67fa1437447b5635"}
Mar 20 11:12:35 crc kubenswrapper[4860]: I0320 11:12:35.026697 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5km9m2"
Mar 20 11:12:35 crc kubenswrapper[4860]: I0320 11:12:35.202090 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ea4498dd-681d-4260-b895-06e53dbcc9b9-util\") pod \"ea4498dd-681d-4260-b895-06e53dbcc9b9\" (UID: \"ea4498dd-681d-4260-b895-06e53dbcc9b9\") "
Mar 20 11:12:35 crc kubenswrapper[4860]: I0320 11:12:35.202318 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rfjkl\" (UniqueName: \"kubernetes.io/projected/ea4498dd-681d-4260-b895-06e53dbcc9b9-kube-api-access-rfjkl\") pod \"ea4498dd-681d-4260-b895-06e53dbcc9b9\" (UID: \"ea4498dd-681d-4260-b895-06e53dbcc9b9\") "
Mar 20 11:12:35 crc kubenswrapper[4860]: I0320 11:12:35.202391 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ea4498dd-681d-4260-b895-06e53dbcc9b9-bundle\") pod \"ea4498dd-681d-4260-b895-06e53dbcc9b9\" (UID: \"ea4498dd-681d-4260-b895-06e53dbcc9b9\") "
Mar 20 11:12:35 crc kubenswrapper[4860]: I0320 11:12:35.206212 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea4498dd-681d-4260-b895-06e53dbcc9b9-bundle" (OuterVolumeSpecName: "bundle") pod "ea4498dd-681d-4260-b895-06e53dbcc9b9" (UID: "ea4498dd-681d-4260-b895-06e53dbcc9b9"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 11:12:35 crc kubenswrapper[4860]: I0320 11:12:35.222537 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea4498dd-681d-4260-b895-06e53dbcc9b9-util" (OuterVolumeSpecName: "util") pod "ea4498dd-681d-4260-b895-06e53dbcc9b9" (UID: "ea4498dd-681d-4260-b895-06e53dbcc9b9"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 11:12:35 crc kubenswrapper[4860]: I0320 11:12:35.223219 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea4498dd-681d-4260-b895-06e53dbcc9b9-kube-api-access-rfjkl" (OuterVolumeSpecName: "kube-api-access-rfjkl") pod "ea4498dd-681d-4260-b895-06e53dbcc9b9" (UID: "ea4498dd-681d-4260-b895-06e53dbcc9b9"). InnerVolumeSpecName "kube-api-access-rfjkl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 11:12:35 crc kubenswrapper[4860]: I0320 11:12:35.305340 4860 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ea4498dd-681d-4260-b895-06e53dbcc9b9-util\") on node \"crc\" DevicePath \"\""
Mar 20 11:12:35 crc kubenswrapper[4860]: I0320 11:12:35.305425 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rfjkl\" (UniqueName: \"kubernetes.io/projected/ea4498dd-681d-4260-b895-06e53dbcc9b9-kube-api-access-rfjkl\") on node \"crc\" DevicePath \"\""
Mar 20 11:12:35 crc kubenswrapper[4860]: I0320 11:12:35.305439 4860 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ea4498dd-681d-4260-b895-06e53dbcc9b9-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 11:12:35 crc kubenswrapper[4860]: I0320 11:12:35.782608 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5km9m2" event={"ID":"ea4498dd-681d-4260-b895-06e53dbcc9b9","Type":"ContainerDied","Data":"5cccc6d4f32f03d93feac9521857e367dfd46c76a0c04d5f4c410c0df53a068f"}
Mar 20 11:12:35 crc kubenswrapper[4860]: I0320 11:12:35.782661 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5cccc6d4f32f03d93feac9521857e367dfd46c76a0c04d5f4c410c0df53a068f"
Mar 20 11:12:35 crc kubenswrapper[4860]: I0320 11:12:35.782712 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5km9m2"
Mar 20 11:12:41 crc kubenswrapper[4860]: I0320 11:12:41.419051 4860 scope.go:117] "RemoveContainer" containerID="6b40403be918a788bbcc242393eb71ec98682fddffb9062133713238970f5b03"
Mar 20 11:12:42 crc kubenswrapper[4860]: I0320 11:12:42.408254 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-mzzpz"
Mar 20 11:12:42 crc kubenswrapper[4860]: I0320 11:12:42.456183 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jhncx"
Mar 20 11:12:43 crc kubenswrapper[4860]: I0320 11:12:43.094195 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-h84nz"]
Mar 20 11:12:43 crc kubenswrapper[4860]: E0320 11:12:43.094531 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea4498dd-681d-4260-b895-06e53dbcc9b9" containerName="extract"
Mar 20 11:12:43 crc kubenswrapper[4860]: I0320 11:12:43.094550 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea4498dd-681d-4260-b895-06e53dbcc9b9" containerName="extract"
Mar 20 11:12:43 crc kubenswrapper[4860]: E0320 11:12:43.094575 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea4498dd-681d-4260-b895-06e53dbcc9b9" containerName="pull"
Mar 20 11:12:43 crc kubenswrapper[4860]: I0320 11:12:43.094584 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea4498dd-681d-4260-b895-06e53dbcc9b9" containerName="pull"
Mar 20 11:12:43 crc kubenswrapper[4860]: E0320 11:12:43.094593 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea4498dd-681d-4260-b895-06e53dbcc9b9" containerName="util"
Mar 20 11:12:43 crc kubenswrapper[4860]: I0320 11:12:43.094601 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea4498dd-681d-4260-b895-06e53dbcc9b9" containerName="util"
Mar 20 11:12:43 crc kubenswrapper[4860]: I0320 11:12:43.094714 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea4498dd-681d-4260-b895-06e53dbcc9b9" containerName="extract"
Mar 20 11:12:43 crc kubenswrapper[4860]: I0320 11:12:43.095265 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-h84nz"
Mar 20 11:12:43 crc kubenswrapper[4860]: I0320 11:12:43.098146 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt"
Mar 20 11:12:43 crc kubenswrapper[4860]: I0320 11:12:43.098280 4860 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-7rn4v"
Mar 20 11:12:43 crc kubenswrapper[4860]: I0320 11:12:43.098403 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt"
Mar 20 11:12:43 crc kubenswrapper[4860]: I0320 11:12:43.113583 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-h84nz"]
Mar 20 11:12:43 crc kubenswrapper[4860]: I0320 11:12:43.222577 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/630d2077-3457-4dcf-b9ab-82f77e819c54-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-h84nz\" (UID: \"630d2077-3457-4dcf-b9ab-82f77e819c54\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-h84nz"
Mar 20 11:12:43 crc kubenswrapper[4860]: I0320 11:12:43.222703 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fz7w\" (UniqueName: \"kubernetes.io/projected/630d2077-3457-4dcf-b9ab-82f77e819c54-kube-api-access-2fz7w\") pod \"cert-manager-operator-controller-manager-66c8bdd694-h84nz\" (UID: \"630d2077-3457-4dcf-b9ab-82f77e819c54\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-h84nz"
Mar 20 11:12:43 crc kubenswrapper[4860]: I0320 11:12:43.323797 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/630d2077-3457-4dcf-b9ab-82f77e819c54-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-h84nz\" (UID: \"630d2077-3457-4dcf-b9ab-82f77e819c54\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-h84nz"
Mar 20 11:12:43 crc kubenswrapper[4860]: I0320 11:12:43.323873 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fz7w\" (UniqueName: \"kubernetes.io/projected/630d2077-3457-4dcf-b9ab-82f77e819c54-kube-api-access-2fz7w\") pod \"cert-manager-operator-controller-manager-66c8bdd694-h84nz\" (UID: \"630d2077-3457-4dcf-b9ab-82f77e819c54\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-h84nz"
Mar 20 11:12:43 crc kubenswrapper[4860]: I0320 11:12:43.324466 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/630d2077-3457-4dcf-b9ab-82f77e819c54-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-h84nz\" (UID: \"630d2077-3457-4dcf-b9ab-82f77e819c54\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-h84nz"
Mar 20 11:12:43 crc kubenswrapper[4860]: I0320 11:12:43.346929 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fz7w\" (UniqueName: \"kubernetes.io/projected/630d2077-3457-4dcf-b9ab-82f77e819c54-kube-api-access-2fz7w\") pod \"cert-manager-operator-controller-manager-66c8bdd694-h84nz\" (UID: \"630d2077-3457-4dcf-b9ab-82f77e819c54\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-h84nz"
Mar 20 11:12:43 crc kubenswrapper[4860]: I0320 11:12:43.413057 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-h84nz"
Mar 20 11:12:43 crc kubenswrapper[4860]: I0320 11:12:43.733589 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-h84nz"]
Mar 20 11:12:43 crc kubenswrapper[4860]: I0320 11:12:43.850653 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-h84nz" event={"ID":"630d2077-3457-4dcf-b9ab-82f77e819c54","Type":"ContainerStarted","Data":"18d35c19543363cefab52b399f58cdc08f4d5374500a57ffc30c5c217835a5c5"}
Mar 20 11:12:49 crc kubenswrapper[4860]: I0320 11:12:49.898732 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-h84nz" event={"ID":"630d2077-3457-4dcf-b9ab-82f77e819c54","Type":"ContainerStarted","Data":"1eb7ab2639b189b7ac10546de5366184b490d9738bbeb42179304524516fad21"}
Mar 20 11:12:49 crc kubenswrapper[4860]: I0320 11:12:49.923075 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-h84nz" podStartSLOduration=0.994380748 podStartE2EDuration="6.923056279s" podCreationTimestamp="2026-03-20 11:12:43 +0000 UTC" firstStartedPulling="2026-03-20 11:12:43.748578863 +0000 UTC m=+1087.969939761" lastFinishedPulling="2026-03-20 11:12:49.677254394 +0000 UTC m=+1093.898615292" observedRunningTime="2026-03-20 11:12:49.916859732 +0000 UTC m=+1094.138220630" watchObservedRunningTime="2026-03-20 11:12:49.923056279 +0000 UTC m=+1094.144417177"
Mar 20 11:12:52 crc kubenswrapper[4860]: I0320 11:12:52.689199 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-skhrl"]
Mar 20 11:12:52 crc kubenswrapper[4860]: I0320 11:12:52.691149 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-skhrl"
Mar 20 11:12:52 crc kubenswrapper[4860]: I0320 11:12:52.698373 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-skhrl"]
Mar 20 11:12:52 crc kubenswrapper[4860]: I0320 11:12:52.699299 4860 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-t9rhk"
Mar 20 11:12:52 crc kubenswrapper[4860]: I0320 11:12:52.699704 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt"
Mar 20 11:12:52 crc kubenswrapper[4860]: I0320 11:12:52.699968 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt"
Mar 20 11:12:52 crc kubenswrapper[4860]: I0320 11:12:52.930198 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4587c778-c12c-48e0-8c28-7eb7a7c1b722-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-skhrl\" (UID: \"4587c778-c12c-48e0-8c28-7eb7a7c1b722\") " pod="cert-manager/cert-manager-webhook-6888856db4-skhrl"
Mar 20 11:12:52 crc kubenswrapper[4860]: I0320 11:12:52.930335 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwgwb\" (UniqueName: \"kubernetes.io/projected/4587c778-c12c-48e0-8c28-7eb7a7c1b722-kube-api-access-rwgwb\") pod \"cert-manager-webhook-6888856db4-skhrl\" (UID: \"4587c778-c12c-48e0-8c28-7eb7a7c1b722\") " pod="cert-manager/cert-manager-webhook-6888856db4-skhrl"
Mar 20 11:12:53 crc kubenswrapper[4860]: I0320 11:12:53.031940 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4587c778-c12c-48e0-8c28-7eb7a7c1b722-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-skhrl\" (UID: \"4587c778-c12c-48e0-8c28-7eb7a7c1b722\") " pod="cert-manager/cert-manager-webhook-6888856db4-skhrl"
Mar 20 11:12:53 crc kubenswrapper[4860]: I0320 11:12:53.032007 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwgwb\" (UniqueName: \"kubernetes.io/projected/4587c778-c12c-48e0-8c28-7eb7a7c1b722-kube-api-access-rwgwb\") pod \"cert-manager-webhook-6888856db4-skhrl\" (UID: \"4587c778-c12c-48e0-8c28-7eb7a7c1b722\") " pod="cert-manager/cert-manager-webhook-6888856db4-skhrl"
Mar 20 11:12:53 crc kubenswrapper[4860]: I0320 11:12:53.055790 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4587c778-c12c-48e0-8c28-7eb7a7c1b722-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-skhrl\" (UID: \"4587c778-c12c-48e0-8c28-7eb7a7c1b722\") " pod="cert-manager/cert-manager-webhook-6888856db4-skhrl"
Mar 20 11:12:53 crc kubenswrapper[4860]: I0320 11:12:53.056121 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwgwb\" (UniqueName: \"kubernetes.io/projected/4587c778-c12c-48e0-8c28-7eb7a7c1b722-kube-api-access-rwgwb\") pod \"cert-manager-webhook-6888856db4-skhrl\" (UID: \"4587c778-c12c-48e0-8c28-7eb7a7c1b722\") " pod="cert-manager/cert-manager-webhook-6888856db4-skhrl"
Mar 20 11:12:53 crc kubenswrapper[4860]: I0320 11:12:53.070839 4860 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-skhrl" Mar 20 11:12:53 crc kubenswrapper[4860]: I0320 11:12:53.814912 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-skhrl"] Mar 20 11:12:53 crc kubenswrapper[4860]: I0320 11:12:53.949833 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-skhrl" event={"ID":"4587c778-c12c-48e0-8c28-7eb7a7c1b722","Type":"ContainerStarted","Data":"8947d1ea38c8e3caa32ce4b6e72203473c07b19f5cb48d1e6d89ceeff94d1432"} Mar 20 11:12:57 crc kubenswrapper[4860]: I0320 11:12:57.095060 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-sjz7s"] Mar 20 11:12:57 crc kubenswrapper[4860]: I0320 11:12:57.097136 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-sjz7s" Mar 20 11:12:57 crc kubenswrapper[4860]: I0320 11:12:57.099205 4860 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-96spt" Mar 20 11:12:57 crc kubenswrapper[4860]: I0320 11:12:57.111017 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-sjz7s"] Mar 20 11:12:57 crc kubenswrapper[4860]: I0320 11:12:57.185533 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chxtm\" (UniqueName: \"kubernetes.io/projected/aa7d8eaa-20ac-4ea3-b19d-e8f89054c619-kube-api-access-chxtm\") pod \"cert-manager-cainjector-5545bd876-sjz7s\" (UID: \"aa7d8eaa-20ac-4ea3-b19d-e8f89054c619\") " pod="cert-manager/cert-manager-cainjector-5545bd876-sjz7s" Mar 20 11:12:57 crc kubenswrapper[4860]: I0320 11:12:57.185700 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/aa7d8eaa-20ac-4ea3-b19d-e8f89054c619-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-sjz7s\" (UID: \"aa7d8eaa-20ac-4ea3-b19d-e8f89054c619\") " pod="cert-manager/cert-manager-cainjector-5545bd876-sjz7s" Mar 20 11:12:57 crc kubenswrapper[4860]: I0320 11:12:57.289296 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/aa7d8eaa-20ac-4ea3-b19d-e8f89054c619-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-sjz7s\" (UID: \"aa7d8eaa-20ac-4ea3-b19d-e8f89054c619\") " pod="cert-manager/cert-manager-cainjector-5545bd876-sjz7s" Mar 20 11:12:57 crc kubenswrapper[4860]: I0320 11:12:57.289423 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chxtm\" (UniqueName: \"kubernetes.io/projected/aa7d8eaa-20ac-4ea3-b19d-e8f89054c619-kube-api-access-chxtm\") pod \"cert-manager-cainjector-5545bd876-sjz7s\" (UID: \"aa7d8eaa-20ac-4ea3-b19d-e8f89054c619\") " pod="cert-manager/cert-manager-cainjector-5545bd876-sjz7s" Mar 20 11:12:57 crc kubenswrapper[4860]: I0320 11:12:57.312299 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/aa7d8eaa-20ac-4ea3-b19d-e8f89054c619-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-sjz7s\" (UID: \"aa7d8eaa-20ac-4ea3-b19d-e8f89054c619\") " pod="cert-manager/cert-manager-cainjector-5545bd876-sjz7s" Mar 20 11:12:57 crc kubenswrapper[4860]: I0320 11:12:57.316481 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chxtm\" (UniqueName: \"kubernetes.io/projected/aa7d8eaa-20ac-4ea3-b19d-e8f89054c619-kube-api-access-chxtm\") pod \"cert-manager-cainjector-5545bd876-sjz7s\" (UID: \"aa7d8eaa-20ac-4ea3-b19d-e8f89054c619\") " pod="cert-manager/cert-manager-cainjector-5545bd876-sjz7s" Mar 20 11:12:57 crc kubenswrapper[4860]: I0320 11:12:57.423645 4860 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-sjz7s" Mar 20 11:13:01 crc kubenswrapper[4860]: I0320 11:13:01.222741 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-sjz7s"] Mar 20 11:13:02 crc kubenswrapper[4860]: I0320 11:13:02.016953 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-skhrl" event={"ID":"4587c778-c12c-48e0-8c28-7eb7a7c1b722","Type":"ContainerStarted","Data":"1f5b7359ddb2c5466b2f230e8d4d31df30245fc54e64e469a4cc280ec6c1c5a0"} Mar 20 11:13:02 crc kubenswrapper[4860]: I0320 11:13:02.017577 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-6888856db4-skhrl" Mar 20 11:13:02 crc kubenswrapper[4860]: I0320 11:13:02.018676 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-sjz7s" event={"ID":"aa7d8eaa-20ac-4ea3-b19d-e8f89054c619","Type":"ContainerStarted","Data":"4a0264fae7cc9a8763d85efc390fbe6ebb3b72c07a1bb87615bb1ce1fa205e23"} Mar 20 11:13:02 crc kubenswrapper[4860]: I0320 11:13:02.018729 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-sjz7s" event={"ID":"aa7d8eaa-20ac-4ea3-b19d-e8f89054c619","Type":"ContainerStarted","Data":"c7661c1d441f146549021fcf912401cd216dbe9fb176214ef73ea97f9d148b5d"} Mar 20 11:13:02 crc kubenswrapper[4860]: I0320 11:13:02.041268 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-6888856db4-skhrl" podStartSLOduration=2.785166915 podStartE2EDuration="10.041243074s" podCreationTimestamp="2026-03-20 11:12:52 +0000 UTC" firstStartedPulling="2026-03-20 11:12:53.825494805 +0000 UTC m=+1098.046855703" lastFinishedPulling="2026-03-20 11:13:01.081570964 +0000 UTC m=+1105.302931862" observedRunningTime="2026-03-20 11:13:02.03556865 +0000 UTC 
m=+1106.256929548" watchObservedRunningTime="2026-03-20 11:13:02.041243074 +0000 UTC m=+1106.262603972" Mar 20 11:13:02 crc kubenswrapper[4860]: I0320 11:13:02.056396 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-5545bd876-sjz7s" podStartSLOduration=5.056375593 podStartE2EDuration="5.056375593s" podCreationTimestamp="2026-03-20 11:12:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 11:13:02.052829957 +0000 UTC m=+1106.274190855" watchObservedRunningTime="2026-03-20 11:13:02.056375593 +0000 UTC m=+1106.277736481" Mar 20 11:13:08 crc kubenswrapper[4860]: I0320 11:13:08.075349 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-6888856db4-skhrl" Mar 20 11:13:12 crc kubenswrapper[4860]: I0320 11:13:12.466670 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-545d4d4674-x6lwp"] Mar 20 11:13:12 crc kubenswrapper[4860]: I0320 11:13:12.468678 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-x6lwp" Mar 20 11:13:12 crc kubenswrapper[4860]: I0320 11:13:12.478040 4860 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-wq9fk" Mar 20 11:13:12 crc kubenswrapper[4860]: I0320 11:13:12.479576 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-x6lwp"] Mar 20 11:13:12 crc kubenswrapper[4860]: I0320 11:13:12.582532 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/74ed19c1-0e46-4fed-b50f-155eaa38aed9-bound-sa-token\") pod \"cert-manager-545d4d4674-x6lwp\" (UID: \"74ed19c1-0e46-4fed-b50f-155eaa38aed9\") " pod="cert-manager/cert-manager-545d4d4674-x6lwp" Mar 20 11:13:12 crc kubenswrapper[4860]: I0320 11:13:12.582613 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55wh9\" (UniqueName: \"kubernetes.io/projected/74ed19c1-0e46-4fed-b50f-155eaa38aed9-kube-api-access-55wh9\") pod \"cert-manager-545d4d4674-x6lwp\" (UID: \"74ed19c1-0e46-4fed-b50f-155eaa38aed9\") " pod="cert-manager/cert-manager-545d4d4674-x6lwp" Mar 20 11:13:12 crc kubenswrapper[4860]: I0320 11:13:12.684190 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55wh9\" (UniqueName: \"kubernetes.io/projected/74ed19c1-0e46-4fed-b50f-155eaa38aed9-kube-api-access-55wh9\") pod \"cert-manager-545d4d4674-x6lwp\" (UID: \"74ed19c1-0e46-4fed-b50f-155eaa38aed9\") " pod="cert-manager/cert-manager-545d4d4674-x6lwp" Mar 20 11:13:12 crc kubenswrapper[4860]: I0320 11:13:12.684261 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/74ed19c1-0e46-4fed-b50f-155eaa38aed9-bound-sa-token\") pod \"cert-manager-545d4d4674-x6lwp\" (UID: 
\"74ed19c1-0e46-4fed-b50f-155eaa38aed9\") " pod="cert-manager/cert-manager-545d4d4674-x6lwp" Mar 20 11:13:12 crc kubenswrapper[4860]: I0320 11:13:12.706488 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/74ed19c1-0e46-4fed-b50f-155eaa38aed9-bound-sa-token\") pod \"cert-manager-545d4d4674-x6lwp\" (UID: \"74ed19c1-0e46-4fed-b50f-155eaa38aed9\") " pod="cert-manager/cert-manager-545d4d4674-x6lwp" Mar 20 11:13:12 crc kubenswrapper[4860]: I0320 11:13:12.706735 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55wh9\" (UniqueName: \"kubernetes.io/projected/74ed19c1-0e46-4fed-b50f-155eaa38aed9-kube-api-access-55wh9\") pod \"cert-manager-545d4d4674-x6lwp\" (UID: \"74ed19c1-0e46-4fed-b50f-155eaa38aed9\") " pod="cert-manager/cert-manager-545d4d4674-x6lwp" Mar 20 11:13:12 crc kubenswrapper[4860]: I0320 11:13:12.802212 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-x6lwp" Mar 20 11:13:13 crc kubenswrapper[4860]: I0320 11:13:13.189243 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-x6lwp"] Mar 20 11:13:13 crc kubenswrapper[4860]: W0320 11:13:13.195907 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod74ed19c1_0e46_4fed_b50f_155eaa38aed9.slice/crio-f33aa1c0aedf3a2d58ce21e8fe745861fa7df5a3a9e98fcf5dfa1bac5ce024fa WatchSource:0}: Error finding container f33aa1c0aedf3a2d58ce21e8fe745861fa7df5a3a9e98fcf5dfa1bac5ce024fa: Status 404 returned error can't find the container with id f33aa1c0aedf3a2d58ce21e8fe745861fa7df5a3a9e98fcf5dfa1bac5ce024fa Mar 20 11:13:14 crc kubenswrapper[4860]: I0320 11:13:14.113255 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-x6lwp" 
event={"ID":"74ed19c1-0e46-4fed-b50f-155eaa38aed9","Type":"ContainerStarted","Data":"ce901f928ecd236de2e6dd1faf0bd6e2b197f0a6a58684993d4d5230170700fa"} Mar 20 11:13:14 crc kubenswrapper[4860]: I0320 11:13:14.113738 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-x6lwp" event={"ID":"74ed19c1-0e46-4fed-b50f-155eaa38aed9","Type":"ContainerStarted","Data":"f33aa1c0aedf3a2d58ce21e8fe745861fa7df5a3a9e98fcf5dfa1bac5ce024fa"} Mar 20 11:13:14 crc kubenswrapper[4860]: I0320 11:13:14.136702 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-545d4d4674-x6lwp" podStartSLOduration=2.136677417 podStartE2EDuration="2.136677417s" podCreationTimestamp="2026-03-20 11:13:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 11:13:14.132452753 +0000 UTC m=+1118.353813661" watchObservedRunningTime="2026-03-20 11:13:14.136677417 +0000 UTC m=+1118.358038325" Mar 20 11:13:21 crc kubenswrapper[4860]: I0320 11:13:21.178218 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-mhn62"] Mar 20 11:13:21 crc kubenswrapper[4860]: I0320 11:13:21.180169 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-mhn62" Mar 20 11:13:21 crc kubenswrapper[4860]: I0320 11:13:21.185617 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 20 11:13:21 crc kubenswrapper[4860]: I0320 11:13:21.185727 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-cz6kh" Mar 20 11:13:21 crc kubenswrapper[4860]: I0320 11:13:21.190395 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 20 11:13:21 crc kubenswrapper[4860]: I0320 11:13:21.203914 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-mhn62"] Mar 20 11:13:21 crc kubenswrapper[4860]: I0320 11:13:21.328250 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgg5j\" (UniqueName: \"kubernetes.io/projected/d2f28a99-89df-4152-a8d5-ddf3a8f3edcd-kube-api-access-xgg5j\") pod \"openstack-operator-index-mhn62\" (UID: \"d2f28a99-89df-4152-a8d5-ddf3a8f3edcd\") " pod="openstack-operators/openstack-operator-index-mhn62" Mar 20 11:13:21 crc kubenswrapper[4860]: I0320 11:13:21.429861 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgg5j\" (UniqueName: \"kubernetes.io/projected/d2f28a99-89df-4152-a8d5-ddf3a8f3edcd-kube-api-access-xgg5j\") pod \"openstack-operator-index-mhn62\" (UID: \"d2f28a99-89df-4152-a8d5-ddf3a8f3edcd\") " pod="openstack-operators/openstack-operator-index-mhn62" Mar 20 11:13:21 crc kubenswrapper[4860]: I0320 11:13:21.459458 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgg5j\" (UniqueName: \"kubernetes.io/projected/d2f28a99-89df-4152-a8d5-ddf3a8f3edcd-kube-api-access-xgg5j\") pod \"openstack-operator-index-mhn62\" (UID: 
\"d2f28a99-89df-4152-a8d5-ddf3a8f3edcd\") " pod="openstack-operators/openstack-operator-index-mhn62" Mar 20 11:13:21 crc kubenswrapper[4860]: I0320 11:13:21.502286 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-mhn62" Mar 20 11:13:21 crc kubenswrapper[4860]: I0320 11:13:21.988717 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-mhn62"] Mar 20 11:13:22 crc kubenswrapper[4860]: I0320 11:13:22.167216 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-mhn62" event={"ID":"d2f28a99-89df-4152-a8d5-ddf3a8f3edcd","Type":"ContainerStarted","Data":"e67e25cb07cccf98304aadd710a0423c4f122b21342a65a9b09ad9931a4c05cc"} Mar 20 11:13:24 crc kubenswrapper[4860]: I0320 11:13:24.555509 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-mhn62"] Mar 20 11:13:25 crc kubenswrapper[4860]: I0320 11:13:25.175538 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-82h6r"] Mar 20 11:13:25 crc kubenswrapper[4860]: I0320 11:13:25.176616 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-82h6r" Mar 20 11:13:25 crc kubenswrapper[4860]: I0320 11:13:25.184262 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-82h6r"] Mar 20 11:13:25 crc kubenswrapper[4860]: I0320 11:13:25.196732 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnx66\" (UniqueName: \"kubernetes.io/projected/f7193309-39f9-4487-b02b-8e9e4d6a69ff-kube-api-access-dnx66\") pod \"openstack-operator-index-82h6r\" (UID: \"f7193309-39f9-4487-b02b-8e9e4d6a69ff\") " pod="openstack-operators/openstack-operator-index-82h6r" Mar 20 11:13:25 crc kubenswrapper[4860]: I0320 11:13:25.297739 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnx66\" (UniqueName: \"kubernetes.io/projected/f7193309-39f9-4487-b02b-8e9e4d6a69ff-kube-api-access-dnx66\") pod \"openstack-operator-index-82h6r\" (UID: \"f7193309-39f9-4487-b02b-8e9e4d6a69ff\") " pod="openstack-operators/openstack-operator-index-82h6r" Mar 20 11:13:25 crc kubenswrapper[4860]: I0320 11:13:25.331047 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnx66\" (UniqueName: \"kubernetes.io/projected/f7193309-39f9-4487-b02b-8e9e4d6a69ff-kube-api-access-dnx66\") pod \"openstack-operator-index-82h6r\" (UID: \"f7193309-39f9-4487-b02b-8e9e4d6a69ff\") " pod="openstack-operators/openstack-operator-index-82h6r" Mar 20 11:13:25 crc kubenswrapper[4860]: I0320 11:13:25.508173 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-82h6r" Mar 20 11:13:26 crc kubenswrapper[4860]: I0320 11:13:26.222358 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-mhn62" event={"ID":"d2f28a99-89df-4152-a8d5-ddf3a8f3edcd","Type":"ContainerStarted","Data":"b8bee47d82b7f9f6349bce4e0d028f154ea992dd78bd7c57f6c33abe1825070d"} Mar 20 11:13:26 crc kubenswrapper[4860]: I0320 11:13:26.222589 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-mhn62" podUID="d2f28a99-89df-4152-a8d5-ddf3a8f3edcd" containerName="registry-server" containerID="cri-o://b8bee47d82b7f9f6349bce4e0d028f154ea992dd78bd7c57f6c33abe1825070d" gracePeriod=2 Mar 20 11:13:26 crc kubenswrapper[4860]: I0320 11:13:26.245193 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-mhn62" podStartSLOduration=1.279419508 podStartE2EDuration="5.245156614s" podCreationTimestamp="2026-03-20 11:13:21 +0000 UTC" firstStartedPulling="2026-03-20 11:13:21.999965046 +0000 UTC m=+1126.221325944" lastFinishedPulling="2026-03-20 11:13:25.965702152 +0000 UTC m=+1130.187063050" observedRunningTime="2026-03-20 11:13:26.24351919 +0000 UTC m=+1130.464880088" watchObservedRunningTime="2026-03-20 11:13:26.245156614 +0000 UTC m=+1130.466517512" Mar 20 11:13:26 crc kubenswrapper[4860]: I0320 11:13:26.271380 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-82h6r"] Mar 20 11:13:26 crc kubenswrapper[4860]: W0320 11:13:26.319871 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7193309_39f9_4487_b02b_8e9e4d6a69ff.slice/crio-356c8758287eaad6a8db73dea56a760cc73fae7d56a51c80cc4690e20327fafd WatchSource:0}: Error finding container 
356c8758287eaad6a8db73dea56a760cc73fae7d56a51c80cc4690e20327fafd: Status 404 returned error can't find the container with id 356c8758287eaad6a8db73dea56a760cc73fae7d56a51c80cc4690e20327fafd Mar 20 11:13:26 crc kubenswrapper[4860]: I0320 11:13:26.610824 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-mhn62_d2f28a99-89df-4152-a8d5-ddf3a8f3edcd/registry-server/0.log" Mar 20 11:13:26 crc kubenswrapper[4860]: I0320 11:13:26.610924 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-mhn62" Mar 20 11:13:26 crc kubenswrapper[4860]: I0320 11:13:26.723256 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xgg5j\" (UniqueName: \"kubernetes.io/projected/d2f28a99-89df-4152-a8d5-ddf3a8f3edcd-kube-api-access-xgg5j\") pod \"d2f28a99-89df-4152-a8d5-ddf3a8f3edcd\" (UID: \"d2f28a99-89df-4152-a8d5-ddf3a8f3edcd\") " Mar 20 11:13:26 crc kubenswrapper[4860]: I0320 11:13:26.731138 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2f28a99-89df-4152-a8d5-ddf3a8f3edcd-kube-api-access-xgg5j" (OuterVolumeSpecName: "kube-api-access-xgg5j") pod "d2f28a99-89df-4152-a8d5-ddf3a8f3edcd" (UID: "d2f28a99-89df-4152-a8d5-ddf3a8f3edcd"). InnerVolumeSpecName "kube-api-access-xgg5j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:13:26 crc kubenswrapper[4860]: I0320 11:13:26.825050 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xgg5j\" (UniqueName: \"kubernetes.io/projected/d2f28a99-89df-4152-a8d5-ddf3a8f3edcd-kube-api-access-xgg5j\") on node \"crc\" DevicePath \"\"" Mar 20 11:13:27 crc kubenswrapper[4860]: I0320 11:13:27.232385 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-82h6r" event={"ID":"f7193309-39f9-4487-b02b-8e9e4d6a69ff","Type":"ContainerStarted","Data":"d9e773f56fb45b1e2eca225ac2751b5f52b755312b96694cabce8aa90553d8fb"} Mar 20 11:13:27 crc kubenswrapper[4860]: I0320 11:13:27.232497 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-82h6r" event={"ID":"f7193309-39f9-4487-b02b-8e9e4d6a69ff","Type":"ContainerStarted","Data":"356c8758287eaad6a8db73dea56a760cc73fae7d56a51c80cc4690e20327fafd"} Mar 20 11:13:27 crc kubenswrapper[4860]: I0320 11:13:27.234787 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-mhn62_d2f28a99-89df-4152-a8d5-ddf3a8f3edcd/registry-server/0.log" Mar 20 11:13:27 crc kubenswrapper[4860]: I0320 11:13:27.234844 4860 generic.go:334] "Generic (PLEG): container finished" podID="d2f28a99-89df-4152-a8d5-ddf3a8f3edcd" containerID="b8bee47d82b7f9f6349bce4e0d028f154ea992dd78bd7c57f6c33abe1825070d" exitCode=2 Mar 20 11:13:27 crc kubenswrapper[4860]: I0320 11:13:27.234896 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-mhn62" event={"ID":"d2f28a99-89df-4152-a8d5-ddf3a8f3edcd","Type":"ContainerDied","Data":"b8bee47d82b7f9f6349bce4e0d028f154ea992dd78bd7c57f6c33abe1825070d"} Mar 20 11:13:27 crc kubenswrapper[4860]: I0320 11:13:27.234936 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-mhn62" 
event={"ID":"d2f28a99-89df-4152-a8d5-ddf3a8f3edcd","Type":"ContainerDied","Data":"e67e25cb07cccf98304aadd710a0423c4f122b21342a65a9b09ad9931a4c05cc"} Mar 20 11:13:27 crc kubenswrapper[4860]: I0320 11:13:27.235005 4860 scope.go:117] "RemoveContainer" containerID="b8bee47d82b7f9f6349bce4e0d028f154ea992dd78bd7c57f6c33abe1825070d" Mar 20 11:13:27 crc kubenswrapper[4860]: I0320 11:13:27.235155 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-mhn62" Mar 20 11:13:27 crc kubenswrapper[4860]: I0320 11:13:27.251843 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-82h6r" podStartSLOduration=2.134626385 podStartE2EDuration="2.251821296s" podCreationTimestamp="2026-03-20 11:13:25 +0000 UTC" firstStartedPulling="2026-03-20 11:13:26.32295121 +0000 UTC m=+1130.544312108" lastFinishedPulling="2026-03-20 11:13:26.440146121 +0000 UTC m=+1130.661507019" observedRunningTime="2026-03-20 11:13:27.249966506 +0000 UTC m=+1131.471327404" watchObservedRunningTime="2026-03-20 11:13:27.251821296 +0000 UTC m=+1131.473182194" Mar 20 11:13:27 crc kubenswrapper[4860]: I0320 11:13:27.257921 4860 scope.go:117] "RemoveContainer" containerID="b8bee47d82b7f9f6349bce4e0d028f154ea992dd78bd7c57f6c33abe1825070d" Mar 20 11:13:27 crc kubenswrapper[4860]: E0320 11:13:27.258593 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8bee47d82b7f9f6349bce4e0d028f154ea992dd78bd7c57f6c33abe1825070d\": container with ID starting with b8bee47d82b7f9f6349bce4e0d028f154ea992dd78bd7c57f6c33abe1825070d not found: ID does not exist" containerID="b8bee47d82b7f9f6349bce4e0d028f154ea992dd78bd7c57f6c33abe1825070d" Mar 20 11:13:27 crc kubenswrapper[4860]: I0320 11:13:27.258635 4860 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b8bee47d82b7f9f6349bce4e0d028f154ea992dd78bd7c57f6c33abe1825070d"} err="failed to get container status \"b8bee47d82b7f9f6349bce4e0d028f154ea992dd78bd7c57f6c33abe1825070d\": rpc error: code = NotFound desc = could not find container \"b8bee47d82b7f9f6349bce4e0d028f154ea992dd78bd7c57f6c33abe1825070d\": container with ID starting with b8bee47d82b7f9f6349bce4e0d028f154ea992dd78bd7c57f6c33abe1825070d not found: ID does not exist" Mar 20 11:13:27 crc kubenswrapper[4860]: I0320 11:13:27.275028 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-mhn62"] Mar 20 11:13:27 crc kubenswrapper[4860]: I0320 11:13:27.280031 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-mhn62"] Mar 20 11:13:27 crc kubenswrapper[4860]: I0320 11:13:27.431834 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2f28a99-89df-4152-a8d5-ddf3a8f3edcd" path="/var/lib/kubelet/pods/d2f28a99-89df-4152-a8d5-ddf3a8f3edcd/volumes" Mar 20 11:13:35 crc kubenswrapper[4860]: I0320 11:13:35.508401 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-82h6r" Mar 20 11:13:35 crc kubenswrapper[4860]: I0320 11:13:35.509022 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-82h6r" Mar 20 11:13:35 crc kubenswrapper[4860]: I0320 11:13:35.551102 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-82h6r" Mar 20 11:13:36 crc kubenswrapper[4860]: I0320 11:13:36.330389 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-82h6r" Mar 20 11:13:38 crc kubenswrapper[4860]: I0320 11:13:38.002073 4860 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858pzhxp"] Mar 20 11:13:38 crc kubenswrapper[4860]: E0320 11:13:38.002650 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2f28a99-89df-4152-a8d5-ddf3a8f3edcd" containerName="registry-server" Mar 20 11:13:38 crc kubenswrapper[4860]: I0320 11:13:38.002664 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2f28a99-89df-4152-a8d5-ddf3a8f3edcd" containerName="registry-server" Mar 20 11:13:38 crc kubenswrapper[4860]: I0320 11:13:38.002790 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2f28a99-89df-4152-a8d5-ddf3a8f3edcd" containerName="registry-server" Mar 20 11:13:38 crc kubenswrapper[4860]: I0320 11:13:38.003656 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858pzhxp" Mar 20 11:13:38 crc kubenswrapper[4860]: I0320 11:13:38.005965 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-d44nt" Mar 20 11:13:38 crc kubenswrapper[4860]: I0320 11:13:38.018731 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858pzhxp"] Mar 20 11:13:38 crc kubenswrapper[4860]: I0320 11:13:38.191210 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c6145c03-dfdd-4224-b2b0-6087b1f137d1-util\") pod \"8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858pzhxp\" (UID: \"c6145c03-dfdd-4224-b2b0-6087b1f137d1\") " pod="openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858pzhxp" Mar 20 11:13:38 crc kubenswrapper[4860]: I0320 11:13:38.191570 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kk2mj\" (UniqueName: 
\"kubernetes.io/projected/c6145c03-dfdd-4224-b2b0-6087b1f137d1-kube-api-access-kk2mj\") pod \"8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858pzhxp\" (UID: \"c6145c03-dfdd-4224-b2b0-6087b1f137d1\") " pod="openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858pzhxp" Mar 20 11:13:38 crc kubenswrapper[4860]: I0320 11:13:38.191888 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c6145c03-dfdd-4224-b2b0-6087b1f137d1-bundle\") pod \"8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858pzhxp\" (UID: \"c6145c03-dfdd-4224-b2b0-6087b1f137d1\") " pod="openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858pzhxp" Mar 20 11:13:38 crc kubenswrapper[4860]: I0320 11:13:38.292641 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c6145c03-dfdd-4224-b2b0-6087b1f137d1-bundle\") pod \"8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858pzhxp\" (UID: \"c6145c03-dfdd-4224-b2b0-6087b1f137d1\") " pod="openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858pzhxp" Mar 20 11:13:38 crc kubenswrapper[4860]: I0320 11:13:38.292738 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c6145c03-dfdd-4224-b2b0-6087b1f137d1-util\") pod \"8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858pzhxp\" (UID: \"c6145c03-dfdd-4224-b2b0-6087b1f137d1\") " pod="openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858pzhxp" Mar 20 11:13:38 crc kubenswrapper[4860]: I0320 11:13:38.292808 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kk2mj\" (UniqueName: \"kubernetes.io/projected/c6145c03-dfdd-4224-b2b0-6087b1f137d1-kube-api-access-kk2mj\") pod 
\"8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858pzhxp\" (UID: \"c6145c03-dfdd-4224-b2b0-6087b1f137d1\") " pod="openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858pzhxp" Mar 20 11:13:38 crc kubenswrapper[4860]: I0320 11:13:38.293354 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c6145c03-dfdd-4224-b2b0-6087b1f137d1-bundle\") pod \"8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858pzhxp\" (UID: \"c6145c03-dfdd-4224-b2b0-6087b1f137d1\") " pod="openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858pzhxp" Mar 20 11:13:38 crc kubenswrapper[4860]: I0320 11:13:38.293819 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c6145c03-dfdd-4224-b2b0-6087b1f137d1-util\") pod \"8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858pzhxp\" (UID: \"c6145c03-dfdd-4224-b2b0-6087b1f137d1\") " pod="openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858pzhxp" Mar 20 11:13:38 crc kubenswrapper[4860]: I0320 11:13:38.316502 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kk2mj\" (UniqueName: \"kubernetes.io/projected/c6145c03-dfdd-4224-b2b0-6087b1f137d1-kube-api-access-kk2mj\") pod \"8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858pzhxp\" (UID: \"c6145c03-dfdd-4224-b2b0-6087b1f137d1\") " pod="openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858pzhxp" Mar 20 11:13:38 crc kubenswrapper[4860]: I0320 11:13:38.322756 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858pzhxp" Mar 20 11:13:38 crc kubenswrapper[4860]: I0320 11:13:38.879990 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858pzhxp"] Mar 20 11:13:38 crc kubenswrapper[4860]: W0320 11:13:38.889432 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6145c03_dfdd_4224_b2b0_6087b1f137d1.slice/crio-1a103de90deaa637218ea1c0517843c848b1587548cb07effced6119e8a7d95b WatchSource:0}: Error finding container 1a103de90deaa637218ea1c0517843c848b1587548cb07effced6119e8a7d95b: Status 404 returned error can't find the container with id 1a103de90deaa637218ea1c0517843c848b1587548cb07effced6119e8a7d95b Mar 20 11:13:39 crc kubenswrapper[4860]: I0320 11:13:39.330730 4860 generic.go:334] "Generic (PLEG): container finished" podID="c6145c03-dfdd-4224-b2b0-6087b1f137d1" containerID="e8129bf6a5c7e4443bc4223ef31160ba924a8902cead5ce3280ed2b69c93300d" exitCode=0 Mar 20 11:13:39 crc kubenswrapper[4860]: I0320 11:13:39.330903 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858pzhxp" event={"ID":"c6145c03-dfdd-4224-b2b0-6087b1f137d1","Type":"ContainerDied","Data":"e8129bf6a5c7e4443bc4223ef31160ba924a8902cead5ce3280ed2b69c93300d"} Mar 20 11:13:39 crc kubenswrapper[4860]: I0320 11:13:39.331362 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858pzhxp" event={"ID":"c6145c03-dfdd-4224-b2b0-6087b1f137d1","Type":"ContainerStarted","Data":"1a103de90deaa637218ea1c0517843c848b1587548cb07effced6119e8a7d95b"} Mar 20 11:13:40 crc kubenswrapper[4860]: I0320 11:13:40.343171 4860 generic.go:334] "Generic (PLEG): container finished" 
podID="c6145c03-dfdd-4224-b2b0-6087b1f137d1" containerID="75afa6cb74a1b94e312b3e23dce2a78c99ad6b105874c3714debba4f94c0a1d0" exitCode=0 Mar 20 11:13:40 crc kubenswrapper[4860]: I0320 11:13:40.343259 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858pzhxp" event={"ID":"c6145c03-dfdd-4224-b2b0-6087b1f137d1","Type":"ContainerDied","Data":"75afa6cb74a1b94e312b3e23dce2a78c99ad6b105874c3714debba4f94c0a1d0"} Mar 20 11:13:41 crc kubenswrapper[4860]: I0320 11:13:41.354493 4860 generic.go:334] "Generic (PLEG): container finished" podID="c6145c03-dfdd-4224-b2b0-6087b1f137d1" containerID="bad9fff9bfd0e2323ffe61ba5a0146235093ea5f58bedbdcab77cef495d5e5dd" exitCode=0 Mar 20 11:13:41 crc kubenswrapper[4860]: I0320 11:13:41.354999 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858pzhxp" event={"ID":"c6145c03-dfdd-4224-b2b0-6087b1f137d1","Type":"ContainerDied","Data":"bad9fff9bfd0e2323ffe61ba5a0146235093ea5f58bedbdcab77cef495d5e5dd"} Mar 20 11:13:42 crc kubenswrapper[4860]: I0320 11:13:42.767666 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858pzhxp" Mar 20 11:13:42 crc kubenswrapper[4860]: I0320 11:13:42.967512 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c6145c03-dfdd-4224-b2b0-6087b1f137d1-util\") pod \"c6145c03-dfdd-4224-b2b0-6087b1f137d1\" (UID: \"c6145c03-dfdd-4224-b2b0-6087b1f137d1\") " Mar 20 11:13:42 crc kubenswrapper[4860]: I0320 11:13:42.967612 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c6145c03-dfdd-4224-b2b0-6087b1f137d1-bundle\") pod \"c6145c03-dfdd-4224-b2b0-6087b1f137d1\" (UID: \"c6145c03-dfdd-4224-b2b0-6087b1f137d1\") " Mar 20 11:13:42 crc kubenswrapper[4860]: I0320 11:13:42.967731 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kk2mj\" (UniqueName: \"kubernetes.io/projected/c6145c03-dfdd-4224-b2b0-6087b1f137d1-kube-api-access-kk2mj\") pod \"c6145c03-dfdd-4224-b2b0-6087b1f137d1\" (UID: \"c6145c03-dfdd-4224-b2b0-6087b1f137d1\") " Mar 20 11:13:42 crc kubenswrapper[4860]: I0320 11:13:42.968758 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6145c03-dfdd-4224-b2b0-6087b1f137d1-bundle" (OuterVolumeSpecName: "bundle") pod "c6145c03-dfdd-4224-b2b0-6087b1f137d1" (UID: "c6145c03-dfdd-4224-b2b0-6087b1f137d1"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:13:42 crc kubenswrapper[4860]: I0320 11:13:42.975302 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6145c03-dfdd-4224-b2b0-6087b1f137d1-kube-api-access-kk2mj" (OuterVolumeSpecName: "kube-api-access-kk2mj") pod "c6145c03-dfdd-4224-b2b0-6087b1f137d1" (UID: "c6145c03-dfdd-4224-b2b0-6087b1f137d1"). InnerVolumeSpecName "kube-api-access-kk2mj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:13:42 crc kubenswrapper[4860]: I0320 11:13:42.981600 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6145c03-dfdd-4224-b2b0-6087b1f137d1-util" (OuterVolumeSpecName: "util") pod "c6145c03-dfdd-4224-b2b0-6087b1f137d1" (UID: "c6145c03-dfdd-4224-b2b0-6087b1f137d1"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:13:43 crc kubenswrapper[4860]: I0320 11:13:43.069299 4860 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c6145c03-dfdd-4224-b2b0-6087b1f137d1-util\") on node \"crc\" DevicePath \"\"" Mar 20 11:13:43 crc kubenswrapper[4860]: I0320 11:13:43.069379 4860 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c6145c03-dfdd-4224-b2b0-6087b1f137d1-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 11:13:43 crc kubenswrapper[4860]: I0320 11:13:43.069392 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kk2mj\" (UniqueName: \"kubernetes.io/projected/c6145c03-dfdd-4224-b2b0-6087b1f137d1-kube-api-access-kk2mj\") on node \"crc\" DevicePath \"\"" Mar 20 11:13:43 crc kubenswrapper[4860]: I0320 11:13:43.372642 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858pzhxp" event={"ID":"c6145c03-dfdd-4224-b2b0-6087b1f137d1","Type":"ContainerDied","Data":"1a103de90deaa637218ea1c0517843c848b1587548cb07effced6119e8a7d95b"} Mar 20 11:13:43 crc kubenswrapper[4860]: I0320 11:13:43.373199 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a103de90deaa637218ea1c0517843c848b1587548cb07effced6119e8a7d95b" Mar 20 11:13:43 crc kubenswrapper[4860]: I0320 11:13:43.372734 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858pzhxp" Mar 20 11:13:50 crc kubenswrapper[4860]: I0320 11:13:50.617981 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-846ffbb776-dppd5"] Mar 20 11:13:50 crc kubenswrapper[4860]: E0320 11:13:50.618914 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6145c03-dfdd-4224-b2b0-6087b1f137d1" containerName="util" Mar 20 11:13:50 crc kubenswrapper[4860]: I0320 11:13:50.618929 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6145c03-dfdd-4224-b2b0-6087b1f137d1" containerName="util" Mar 20 11:13:50 crc kubenswrapper[4860]: E0320 11:13:50.618937 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6145c03-dfdd-4224-b2b0-6087b1f137d1" containerName="extract" Mar 20 11:13:50 crc kubenswrapper[4860]: I0320 11:13:50.618943 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6145c03-dfdd-4224-b2b0-6087b1f137d1" containerName="extract" Mar 20 11:13:50 crc kubenswrapper[4860]: E0320 11:13:50.618954 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6145c03-dfdd-4224-b2b0-6087b1f137d1" containerName="pull" Mar 20 11:13:50 crc kubenswrapper[4860]: I0320 11:13:50.618960 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6145c03-dfdd-4224-b2b0-6087b1f137d1" containerName="pull" Mar 20 11:13:50 crc kubenswrapper[4860]: I0320 11:13:50.619084 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6145c03-dfdd-4224-b2b0-6087b1f137d1" containerName="extract" Mar 20 11:13:50 crc kubenswrapper[4860]: I0320 11:13:50.619693 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-846ffbb776-dppd5" Mar 20 11:13:50 crc kubenswrapper[4860]: I0320 11:13:50.622768 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-mz62q" Mar 20 11:13:50 crc kubenswrapper[4860]: I0320 11:13:50.649887 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-846ffbb776-dppd5"] Mar 20 11:13:50 crc kubenswrapper[4860]: I0320 11:13:50.682168 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8b57j\" (UniqueName: \"kubernetes.io/projected/31f3fcff-ca2c-40b5-bdf3-018132ccb63b-kube-api-access-8b57j\") pod \"openstack-operator-controller-init-846ffbb776-dppd5\" (UID: \"31f3fcff-ca2c-40b5-bdf3-018132ccb63b\") " pod="openstack-operators/openstack-operator-controller-init-846ffbb776-dppd5" Mar 20 11:13:50 crc kubenswrapper[4860]: I0320 11:13:50.783698 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8b57j\" (UniqueName: \"kubernetes.io/projected/31f3fcff-ca2c-40b5-bdf3-018132ccb63b-kube-api-access-8b57j\") pod \"openstack-operator-controller-init-846ffbb776-dppd5\" (UID: \"31f3fcff-ca2c-40b5-bdf3-018132ccb63b\") " pod="openstack-operators/openstack-operator-controller-init-846ffbb776-dppd5" Mar 20 11:13:50 crc kubenswrapper[4860]: I0320 11:13:50.807625 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8b57j\" (UniqueName: \"kubernetes.io/projected/31f3fcff-ca2c-40b5-bdf3-018132ccb63b-kube-api-access-8b57j\") pod \"openstack-operator-controller-init-846ffbb776-dppd5\" (UID: \"31f3fcff-ca2c-40b5-bdf3-018132ccb63b\") " pod="openstack-operators/openstack-operator-controller-init-846ffbb776-dppd5" Mar 20 11:13:50 crc kubenswrapper[4860]: I0320 11:13:50.944450 4860 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-846ffbb776-dppd5" Mar 20 11:13:51 crc kubenswrapper[4860]: I0320 11:13:51.489341 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-846ffbb776-dppd5"] Mar 20 11:13:52 crc kubenswrapper[4860]: I0320 11:13:52.344488 4860 patch_prober.go:28] interesting pod/machine-config-daemon-kvdqp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:13:52 crc kubenswrapper[4860]: I0320 11:13:52.345070 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:13:52 crc kubenswrapper[4860]: I0320 11:13:52.442119 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-846ffbb776-dppd5" event={"ID":"31f3fcff-ca2c-40b5-bdf3-018132ccb63b","Type":"ContainerStarted","Data":"26ca22ffb488640010b32159f489f1a42073cd419130b6c1debff8c9062f4330"} Mar 20 11:13:57 crc kubenswrapper[4860]: I0320 11:13:57.497789 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-846ffbb776-dppd5" event={"ID":"31f3fcff-ca2c-40b5-bdf3-018132ccb63b","Type":"ContainerStarted","Data":"c688ae61d3d7e01c8c97058657caa3a3046a09fe5010c8fc985b85eda14c5cbf"} Mar 20 11:13:57 crc kubenswrapper[4860]: I0320 11:13:57.499592 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-846ffbb776-dppd5" Mar 20 11:13:57 crc 
kubenswrapper[4860]: I0320 11:13:57.534119 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-846ffbb776-dppd5" podStartSLOduration=1.726034311 podStartE2EDuration="7.534093882s" podCreationTimestamp="2026-03-20 11:13:50 +0000 UTC" firstStartedPulling="2026-03-20 11:13:51.4939245 +0000 UTC m=+1155.715285398" lastFinishedPulling="2026-03-20 11:13:57.301984071 +0000 UTC m=+1161.523344969" observedRunningTime="2026-03-20 11:13:57.532312544 +0000 UTC m=+1161.753673462" watchObservedRunningTime="2026-03-20 11:13:57.534093882 +0000 UTC m=+1161.755454780" Mar 20 11:14:00 crc kubenswrapper[4860]: I0320 11:14:00.137898 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566754-wdxxk"] Mar 20 11:14:00 crc kubenswrapper[4860]: I0320 11:14:00.139307 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566754-wdxxk" Mar 20 11:14:00 crc kubenswrapper[4860]: I0320 11:14:00.150196 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:14:00 crc kubenswrapper[4860]: I0320 11:14:00.150196 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-82p2f" Mar 20 11:14:00 crc kubenswrapper[4860]: I0320 11:14:00.150566 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:14:00 crc kubenswrapper[4860]: I0320 11:14:00.151882 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566754-wdxxk"] Mar 20 11:14:00 crc kubenswrapper[4860]: I0320 11:14:00.238505 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbk9b\" (UniqueName: \"kubernetes.io/projected/a90da115-522c-4858-935f-7d4a7211c8cb-kube-api-access-sbk9b\") pod 
\"auto-csr-approver-29566754-wdxxk\" (UID: \"a90da115-522c-4858-935f-7d4a7211c8cb\") " pod="openshift-infra/auto-csr-approver-29566754-wdxxk" Mar 20 11:14:00 crc kubenswrapper[4860]: I0320 11:14:00.339911 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbk9b\" (UniqueName: \"kubernetes.io/projected/a90da115-522c-4858-935f-7d4a7211c8cb-kube-api-access-sbk9b\") pod \"auto-csr-approver-29566754-wdxxk\" (UID: \"a90da115-522c-4858-935f-7d4a7211c8cb\") " pod="openshift-infra/auto-csr-approver-29566754-wdxxk" Mar 20 11:14:00 crc kubenswrapper[4860]: I0320 11:14:00.370557 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbk9b\" (UniqueName: \"kubernetes.io/projected/a90da115-522c-4858-935f-7d4a7211c8cb-kube-api-access-sbk9b\") pod \"auto-csr-approver-29566754-wdxxk\" (UID: \"a90da115-522c-4858-935f-7d4a7211c8cb\") " pod="openshift-infra/auto-csr-approver-29566754-wdxxk" Mar 20 11:14:00 crc kubenswrapper[4860]: I0320 11:14:00.477527 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566754-wdxxk" Mar 20 11:14:00 crc kubenswrapper[4860]: I0320 11:14:00.740797 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566754-wdxxk"] Mar 20 11:14:00 crc kubenswrapper[4860]: W0320 11:14:00.759437 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda90da115_522c_4858_935f_7d4a7211c8cb.slice/crio-7cec2c7efab9280ea550027c0d637c29d57bbfbd7df67987fc0fb4279adcde9e WatchSource:0}: Error finding container 7cec2c7efab9280ea550027c0d637c29d57bbfbd7df67987fc0fb4279adcde9e: Status 404 returned error can't find the container with id 7cec2c7efab9280ea550027c0d637c29d57bbfbd7df67987fc0fb4279adcde9e Mar 20 11:14:01 crc kubenswrapper[4860]: I0320 11:14:01.527921 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566754-wdxxk" event={"ID":"a90da115-522c-4858-935f-7d4a7211c8cb","Type":"ContainerStarted","Data":"7cec2c7efab9280ea550027c0d637c29d57bbfbd7df67987fc0fb4279adcde9e"} Mar 20 11:14:06 crc kubenswrapper[4860]: I0320 11:14:06.564650 4860 generic.go:334] "Generic (PLEG): container finished" podID="a90da115-522c-4858-935f-7d4a7211c8cb" containerID="f841889007b20caf67f3aa615ee7ba8514f947be4c45bbec766af3b7f7efe1d8" exitCode=0 Mar 20 11:14:06 crc kubenswrapper[4860]: I0320 11:14:06.565120 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566754-wdxxk" event={"ID":"a90da115-522c-4858-935f-7d4a7211c8cb","Type":"ContainerDied","Data":"f841889007b20caf67f3aa615ee7ba8514f947be4c45bbec766af3b7f7efe1d8"} Mar 20 11:14:07 crc kubenswrapper[4860]: I0320 11:14:07.878002 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566754-wdxxk" Mar 20 11:14:07 crc kubenswrapper[4860]: I0320 11:14:07.956618 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbk9b\" (UniqueName: \"kubernetes.io/projected/a90da115-522c-4858-935f-7d4a7211c8cb-kube-api-access-sbk9b\") pod \"a90da115-522c-4858-935f-7d4a7211c8cb\" (UID: \"a90da115-522c-4858-935f-7d4a7211c8cb\") " Mar 20 11:14:07 crc kubenswrapper[4860]: I0320 11:14:07.964415 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a90da115-522c-4858-935f-7d4a7211c8cb-kube-api-access-sbk9b" (OuterVolumeSpecName: "kube-api-access-sbk9b") pod "a90da115-522c-4858-935f-7d4a7211c8cb" (UID: "a90da115-522c-4858-935f-7d4a7211c8cb"). InnerVolumeSpecName "kube-api-access-sbk9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:14:08 crc kubenswrapper[4860]: I0320 11:14:08.058155 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbk9b\" (UniqueName: \"kubernetes.io/projected/a90da115-522c-4858-935f-7d4a7211c8cb-kube-api-access-sbk9b\") on node \"crc\" DevicePath \"\"" Mar 20 11:14:08 crc kubenswrapper[4860]: I0320 11:14:08.581792 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566754-wdxxk" event={"ID":"a90da115-522c-4858-935f-7d4a7211c8cb","Type":"ContainerDied","Data":"7cec2c7efab9280ea550027c0d637c29d57bbfbd7df67987fc0fb4279adcde9e"} Mar 20 11:14:08 crc kubenswrapper[4860]: I0320 11:14:08.581847 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7cec2c7efab9280ea550027c0d637c29d57bbfbd7df67987fc0fb4279adcde9e" Mar 20 11:14:08 crc kubenswrapper[4860]: I0320 11:14:08.581856 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566754-wdxxk" Mar 20 11:14:08 crc kubenswrapper[4860]: I0320 11:14:08.932736 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566748-z8vsk"] Mar 20 11:14:08 crc kubenswrapper[4860]: I0320 11:14:08.941136 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566748-z8vsk"] Mar 20 11:14:09 crc kubenswrapper[4860]: I0320 11:14:09.422914 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d05f5e64-f0ec-45f9-a491-7dde7bdf6538" path="/var/lib/kubelet/pods/d05f5e64-f0ec-45f9-a491-7dde7bdf6538/volumes" Mar 20 11:14:10 crc kubenswrapper[4860]: I0320 11:14:10.948911 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-846ffbb776-dppd5" Mar 20 11:14:22 crc kubenswrapper[4860]: I0320 11:14:22.344616 4860 patch_prober.go:28] interesting pod/machine-config-daemon-kvdqp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:14:22 crc kubenswrapper[4860]: I0320 11:14:22.345451 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.354321 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-8dh72"] Mar 20 11:14:29 crc kubenswrapper[4860]: E0320 11:14:29.355668 4860 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a90da115-522c-4858-935f-7d4a7211c8cb" containerName="oc" Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.355684 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="a90da115-522c-4858-935f-7d4a7211c8cb" containerName="oc" Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.355804 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="a90da115-522c-4858-935f-7d4a7211c8cb" containerName="oc" Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.356380 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-8dh72" Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.359823 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-g929m" Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.360384 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-s2kwq"] Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.365339 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-s2kwq" Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.367771 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-p9jpf" Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.375156 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-8dh72"] Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.382820 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-s2kwq"] Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.447411 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-2692b"] Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.448531 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-2692b" Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.451284 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-tmw4g" Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.460736 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n98jz\" (UniqueName: \"kubernetes.io/projected/178fff2d-699c-4cab-8626-3e30a6bd9ed6-kube-api-access-n98jz\") pod \"cinder-operator-controller-manager-8d58dc466-s2kwq\" (UID: \"178fff2d-699c-4cab-8626-3e30a6bd9ed6\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-s2kwq" Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.460879 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k44b2\" (UniqueName: \"kubernetes.io/projected/8b4d2530-4f67-45e8-9444-bea25fdad6ae-kube-api-access-k44b2\") pod \"barbican-operator-controller-manager-59bc569d95-8dh72\" (UID: \"8b4d2530-4f67-45e8-9444-bea25fdad6ae\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-8dh72" Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.478671 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-vw2d9"] Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.479987 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-vw2d9" Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.489495 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-k9fnc" Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.489745 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-zphz9"] Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.490944 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-zphz9" Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.493859 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-t8v9p" Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.503386 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-2692b"] Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.519757 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-zphz9"] Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.541309 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-vw2d9"] Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.556416 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-wfczk"] Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.557623 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-wfczk" Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.562066 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n98jz\" (UniqueName: \"kubernetes.io/projected/178fff2d-699c-4cab-8626-3e30a6bd9ed6-kube-api-access-n98jz\") pod \"cinder-operator-controller-manager-8d58dc466-s2kwq\" (UID: \"178fff2d-699c-4cab-8626-3e30a6bd9ed6\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-s2kwq" Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.562175 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkfnh\" (UniqueName: \"kubernetes.io/projected/20d35dc6-0fc2-4651-9dcd-855814132a5f-kube-api-access-nkfnh\") pod \"designate-operator-controller-manager-588d4d986b-2692b\" (UID: \"20d35dc6-0fc2-4651-9dcd-855814132a5f\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-2692b" Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.562271 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8n2hk\" (UniqueName: \"kubernetes.io/projected/36138670-7449-4d49-8a23-73b57d10b67f-kube-api-access-8n2hk\") pod \"heat-operator-controller-manager-67dd5f86f5-vw2d9\" (UID: \"36138670-7449-4d49-8a23-73b57d10b67f\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-vw2d9" Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.562300 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k44b2\" (UniqueName: \"kubernetes.io/projected/8b4d2530-4f67-45e8-9444-bea25fdad6ae-kube-api-access-k44b2\") pod \"barbican-operator-controller-manager-59bc569d95-8dh72\" (UID: \"8b4d2530-4f67-45e8-9444-bea25fdad6ae\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-8dh72" Mar 20 11:14:29 
crc kubenswrapper[4860]: I0320 11:14:29.563590 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-gqpfd" Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.566261 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-wfczk"] Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.593319 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-669fff9c7c-njzqs"] Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.594988 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-njzqs" Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.602203 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.608627 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-m7gfz" Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.613716 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-669fff9c7c-njzqs"] Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.627753 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-mc48w"] Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.629769 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-mc48w" Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.637757 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k44b2\" (UniqueName: \"kubernetes.io/projected/8b4d2530-4f67-45e8-9444-bea25fdad6ae-kube-api-access-k44b2\") pod \"barbican-operator-controller-manager-59bc569d95-8dh72\" (UID: \"8b4d2530-4f67-45e8-9444-bea25fdad6ae\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-8dh72" Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.638426 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n98jz\" (UniqueName: \"kubernetes.io/projected/178fff2d-699c-4cab-8626-3e30a6bd9ed6-kube-api-access-n98jz\") pod \"cinder-operator-controller-manager-8d58dc466-s2kwq\" (UID: \"178fff2d-699c-4cab-8626-3e30a6bd9ed6\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-s2kwq" Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.639434 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-8t2wr" Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.649283 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-pq75b"] Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.650291 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-pq75b" Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.655402 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-fb2fd" Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.660470 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-mc48w"] Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.663567 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkfnh\" (UniqueName: \"kubernetes.io/projected/20d35dc6-0fc2-4651-9dcd-855814132a5f-kube-api-access-nkfnh\") pod \"designate-operator-controller-manager-588d4d986b-2692b\" (UID: \"20d35dc6-0fc2-4651-9dcd-855814132a5f\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-2692b" Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.663642 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmzc5\" (UniqueName: \"kubernetes.io/projected/c54f27c4-bd61-4bad-bf91-376fee65d219-kube-api-access-pmzc5\") pod \"horizon-operator-controller-manager-8464cc45fb-wfczk\" (UID: \"c54f27c4-bd61-4bad-bf91-376fee65d219\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-wfczk" Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.663683 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgkkw\" (UniqueName: \"kubernetes.io/projected/5fdbc315-f7fd-47ac-aa39-fdbe068f6f3b-kube-api-access-vgkkw\") pod \"glance-operator-controller-manager-79df6bcc97-zphz9\" (UID: \"5fdbc315-f7fd-47ac-aa39-fdbe068f6f3b\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-zphz9" Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.663717 4860 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8n2hk\" (UniqueName: \"kubernetes.io/projected/36138670-7449-4d49-8a23-73b57d10b67f-kube-api-access-8n2hk\") pod \"heat-operator-controller-manager-67dd5f86f5-vw2d9\" (UID: \"36138670-7449-4d49-8a23-73b57d10b67f\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-vw2d9" Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.701337 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-pq75b"] Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.704111 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8n2hk\" (UniqueName: \"kubernetes.io/projected/36138670-7449-4d49-8a23-73b57d10b67f-kube-api-access-8n2hk\") pod \"heat-operator-controller-manager-67dd5f86f5-vw2d9\" (UID: \"36138670-7449-4d49-8a23-73b57d10b67f\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-vw2d9" Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.739867 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-pzk5m"] Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.741188 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-pzk5m" Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.742390 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-8dh72" Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.744307 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-fkbb7" Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.746106 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkfnh\" (UniqueName: \"kubernetes.io/projected/20d35dc6-0fc2-4651-9dcd-855814132a5f-kube-api-access-nkfnh\") pod \"designate-operator-controller-manager-588d4d986b-2692b\" (UID: \"20d35dc6-0fc2-4651-9dcd-855814132a5f\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-2692b" Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.753516 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-s2kwq" Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.761300 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-pzk5m"] Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.765475 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r45xs\" (UniqueName: \"kubernetes.io/projected/acf57205-3b95-48a3-8222-1b57b0b6c54b-kube-api-access-r45xs\") pod \"ironic-operator-controller-manager-6f787dddc9-mc48w\" (UID: \"acf57205-3b95-48a3-8222-1b57b0b6c54b\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-mc48w" Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.765540 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/70703379-8eb2-4f8a-95c8-302b53692a53-cert\") pod \"infra-operator-controller-manager-669fff9c7c-njzqs\" (UID: 
\"70703379-8eb2-4f8a-95c8-302b53692a53\") " pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-njzqs" Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.765592 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmzc5\" (UniqueName: \"kubernetes.io/projected/c54f27c4-bd61-4bad-bf91-376fee65d219-kube-api-access-pmzc5\") pod \"horizon-operator-controller-manager-8464cc45fb-wfczk\" (UID: \"c54f27c4-bd61-4bad-bf91-376fee65d219\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-wfczk" Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.765691 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgkkw\" (UniqueName: \"kubernetes.io/projected/5fdbc315-f7fd-47ac-aa39-fdbe068f6f3b-kube-api-access-vgkkw\") pod \"glance-operator-controller-manager-79df6bcc97-zphz9\" (UID: \"5fdbc315-f7fd-47ac-aa39-fdbe068f6f3b\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-zphz9" Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.765721 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlfrn\" (UniqueName: \"kubernetes.io/projected/70703379-8eb2-4f8a-95c8-302b53692a53-kube-api-access-wlfrn\") pod \"infra-operator-controller-manager-669fff9c7c-njzqs\" (UID: \"70703379-8eb2-4f8a-95c8-302b53692a53\") " pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-njzqs" Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.765742 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9jc4\" (UniqueName: \"kubernetes.io/projected/fbbe8243-9afb-4fc5-90f1-04d6f0c074ef-kube-api-access-f9jc4\") pod \"keystone-operator-controller-manager-768b96df4c-pq75b\" (UID: \"fbbe8243-9afb-4fc5-90f1-04d6f0c074ef\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-pq75b" 
Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.765827 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-2692b" Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.854676 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-vw2d9" Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.855032 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmzc5\" (UniqueName: \"kubernetes.io/projected/c54f27c4-bd61-4bad-bf91-376fee65d219-kube-api-access-pmzc5\") pod \"horizon-operator-controller-manager-8464cc45fb-wfczk\" (UID: \"c54f27c4-bd61-4bad-bf91-376fee65d219\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-wfczk" Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.861376 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-m8948"] Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.863984 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgkkw\" (UniqueName: \"kubernetes.io/projected/5fdbc315-f7fd-47ac-aa39-fdbe068f6f3b-kube-api-access-vgkkw\") pod \"glance-operator-controller-manager-79df6bcc97-zphz9\" (UID: \"5fdbc315-f7fd-47ac-aa39-fdbe068f6f3b\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-zphz9" Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.961276 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-m8948" Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.962503 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-wfczk" Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.968460 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r45xs\" (UniqueName: \"kubernetes.io/projected/acf57205-3b95-48a3-8222-1b57b0b6c54b-kube-api-access-r45xs\") pod \"ironic-operator-controller-manager-6f787dddc9-mc48w\" (UID: \"acf57205-3b95-48a3-8222-1b57b0b6c54b\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-mc48w" Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.968537 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/70703379-8eb2-4f8a-95c8-302b53692a53-cert\") pod \"infra-operator-controller-manager-669fff9c7c-njzqs\" (UID: \"70703379-8eb2-4f8a-95c8-302b53692a53\") " pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-njzqs" Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.968712 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlfrn\" (UniqueName: \"kubernetes.io/projected/70703379-8eb2-4f8a-95c8-302b53692a53-kube-api-access-wlfrn\") pod \"infra-operator-controller-manager-669fff9c7c-njzqs\" (UID: \"70703379-8eb2-4f8a-95c8-302b53692a53\") " pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-njzqs" Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.968753 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9jc4\" (UniqueName: \"kubernetes.io/projected/fbbe8243-9afb-4fc5-90f1-04d6f0c074ef-kube-api-access-f9jc4\") pod \"keystone-operator-controller-manager-768b96df4c-pq75b\" (UID: \"fbbe8243-9afb-4fc5-90f1-04d6f0c074ef\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-pq75b" Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.968875 4860 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlvhs\" (UniqueName: \"kubernetes.io/projected/0fe9b978-da91-4568-9b77-0d5930aca888-kube-api-access-qlvhs\") pod \"manila-operator-controller-manager-55f864c847-pzk5m\" (UID: \"0fe9b978-da91-4568-9b77-0d5930aca888\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-pzk5m" Mar 20 11:14:29 crc kubenswrapper[4860]: E0320 11:14:29.969455 4860 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 11:14:29 crc kubenswrapper[4860]: E0320 11:14:29.969526 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70703379-8eb2-4f8a-95c8-302b53692a53-cert podName:70703379-8eb2-4f8a-95c8-302b53692a53 nodeName:}" failed. No retries permitted until 2026-03-20 11:14:30.469502745 +0000 UTC m=+1194.690863643 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/70703379-8eb2-4f8a-95c8-302b53692a53-cert") pod "infra-operator-controller-manager-669fff9c7c-njzqs" (UID: "70703379-8eb2-4f8a-95c8-302b53692a53") : secret "infra-operator-webhook-server-cert" not found Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.969970 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-b46m4" Mar 20 11:14:29 crc kubenswrapper[4860]: I0320 11:14:29.998064 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-m8948"] Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.001332 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-2vsjq"] Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.003173 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-2vsjq" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.008174 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-mhqhl" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.028475 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlfrn\" (UniqueName: \"kubernetes.io/projected/70703379-8eb2-4f8a-95c8-302b53692a53-kube-api-access-wlfrn\") pod \"infra-operator-controller-manager-669fff9c7c-njzqs\" (UID: \"70703379-8eb2-4f8a-95c8-302b53692a53\") " pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-njzqs" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.039250 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-2vsjq"] Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.048096 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r45xs\" (UniqueName: \"kubernetes.io/projected/acf57205-3b95-48a3-8222-1b57b0b6c54b-kube-api-access-r45xs\") pod \"ironic-operator-controller-manager-6f787dddc9-mc48w\" (UID: \"acf57205-3b95-48a3-8222-1b57b0b6c54b\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-mc48w" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.056365 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-z8fp5"] Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.057662 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-z8fp5" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.062489 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-jxfnd" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.068288 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-tjt52"] Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.070676 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlvhs\" (UniqueName: \"kubernetes.io/projected/0fe9b978-da91-4568-9b77-0d5930aca888-kube-api-access-qlvhs\") pod \"manila-operator-controller-manager-55f864c847-pzk5m\" (UID: \"0fe9b978-da91-4568-9b77-0d5930aca888\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-pzk5m" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.070722 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99lpj\" (UniqueName: \"kubernetes.io/projected/d7202366-6dc1-45ca-bb9a-74bdd0426c5f-kube-api-access-99lpj\") pod \"mariadb-operator-controller-manager-67ccfc9778-m8948\" (UID: \"d7202366-6dc1-45ca-bb9a-74bdd0426c5f\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-m8948" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.070803 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzznl\" (UniqueName: \"kubernetes.io/projected/6c2530cf-70b4-4a89-acff-086b36773edf-kube-api-access-dzznl\") pod \"nova-operator-controller-manager-5d488d59fb-z8fp5\" (UID: \"6c2530cf-70b4-4a89-acff-086b36773edf\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-z8fp5" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.070824 4860 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmgx4\" (UniqueName: \"kubernetes.io/projected/29801d0c-963e-4b38-ad2d-8b03d3ade0be-kube-api-access-lmgx4\") pod \"neutron-operator-controller-manager-767865f676-2vsjq\" (UID: \"29801d0c-963e-4b38-ad2d-8b03d3ade0be\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-2vsjq" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.071070 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-tjt52" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.071588 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9jc4\" (UniqueName: \"kubernetes.io/projected/fbbe8243-9afb-4fc5-90f1-04d6f0c074ef-kube-api-access-f9jc4\") pod \"keystone-operator-controller-manager-768b96df4c-pq75b\" (UID: \"fbbe8243-9afb-4fc5-90f1-04d6f0c074ef\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-pq75b" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.080413 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-z77q2" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.093550 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-z8fp5"] Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.109309 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-tjt52"] Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.109377 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-4nk5c"] Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.112560 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-4nk5c" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.116089 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-5tqgx"] Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.117093 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-p7g4n" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.117200 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-5tqgx" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.118021 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlvhs\" (UniqueName: \"kubernetes.io/projected/0fe9b978-da91-4568-9b77-0d5930aca888-kube-api-access-qlvhs\") pod \"manila-operator-controller-manager-55f864c847-pzk5m\" (UID: \"0fe9b978-da91-4568-9b77-0d5930aca888\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-pzk5m" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.125050 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.125546 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-zphz9" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.129211 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-k6th9" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.153039 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-4nk5c"] Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.172376 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99lpj\" (UniqueName: \"kubernetes.io/projected/d7202366-6dc1-45ca-bb9a-74bdd0426c5f-kube-api-access-99lpj\") pod \"mariadb-operator-controller-manager-67ccfc9778-m8948\" (UID: \"d7202366-6dc1-45ca-bb9a-74bdd0426c5f\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-m8948" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.172426 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7pvw\" (UniqueName: \"kubernetes.io/projected/431ab970-7f36-4ace-860c-479faac092a0-kube-api-access-q7pvw\") pod \"octavia-operator-controller-manager-5b9f45d989-tjt52\" (UID: \"431ab970-7f36-4ace-860c-479faac092a0\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-tjt52" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.172485 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nn2x\" (UniqueName: \"kubernetes.io/projected/c736e6d7-6806-4ef3-a0b3-f1b17ab33037-kube-api-access-4nn2x\") pod \"ovn-operator-controller-manager-884679f54-4nk5c\" (UID: \"c736e6d7-6806-4ef3-a0b3-f1b17ab33037\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-4nk5c" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.172507 4860 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ecf64e38-138d-4ef7-8b17-c09f30358f3e-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-5tqgx\" (UID: \"ecf64e38-138d-4ef7-8b17-c09f30358f3e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-5tqgx" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.172535 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzznl\" (UniqueName: \"kubernetes.io/projected/6c2530cf-70b4-4a89-acff-086b36773edf-kube-api-access-dzznl\") pod \"nova-operator-controller-manager-5d488d59fb-z8fp5\" (UID: \"6c2530cf-70b4-4a89-acff-086b36773edf\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-z8fp5" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.172555 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmgx4\" (UniqueName: \"kubernetes.io/projected/29801d0c-963e-4b38-ad2d-8b03d3ade0be-kube-api-access-lmgx4\") pod \"neutron-operator-controller-manager-767865f676-2vsjq\" (UID: \"29801d0c-963e-4b38-ad2d-8b03d3ade0be\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-2vsjq" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.172578 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmqk5\" (UniqueName: \"kubernetes.io/projected/ecf64e38-138d-4ef7-8b17-c09f30358f3e-kube-api-access-pmqk5\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-5tqgx\" (UID: \"ecf64e38-138d-4ef7-8b17-c09f30358f3e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-5tqgx" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.201310 4860 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/placement-operator-controller-manager-5784578c99-4tdg4"] Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.202600 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-4tdg4" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.221315 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-5pnn4" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.225247 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-5tqgx"] Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.230145 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzznl\" (UniqueName: \"kubernetes.io/projected/6c2530cf-70b4-4a89-acff-086b36773edf-kube-api-access-dzznl\") pod \"nova-operator-controller-manager-5d488d59fb-z8fp5\" (UID: \"6c2530cf-70b4-4a89-acff-086b36773edf\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-z8fp5" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.243300 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmgx4\" (UniqueName: \"kubernetes.io/projected/29801d0c-963e-4b38-ad2d-8b03d3ade0be-kube-api-access-lmgx4\") pod \"neutron-operator-controller-manager-767865f676-2vsjq\" (UID: \"29801d0c-963e-4b38-ad2d-8b03d3ade0be\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-2vsjq" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.260740 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-pzk5m" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.276486 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hq7gw\" (UniqueName: \"kubernetes.io/projected/7f73053a-86aa-42dc-bcca-ee26a4fda2e5-kube-api-access-hq7gw\") pod \"placement-operator-controller-manager-5784578c99-4tdg4\" (UID: \"7f73053a-86aa-42dc-bcca-ee26a4fda2e5\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-4tdg4" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.276580 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7pvw\" (UniqueName: \"kubernetes.io/projected/431ab970-7f36-4ace-860c-479faac092a0-kube-api-access-q7pvw\") pod \"octavia-operator-controller-manager-5b9f45d989-tjt52\" (UID: \"431ab970-7f36-4ace-860c-479faac092a0\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-tjt52" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.276702 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nn2x\" (UniqueName: \"kubernetes.io/projected/c736e6d7-6806-4ef3-a0b3-f1b17ab33037-kube-api-access-4nn2x\") pod \"ovn-operator-controller-manager-884679f54-4nk5c\" (UID: \"c736e6d7-6806-4ef3-a0b3-f1b17ab33037\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-4nk5c" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.276735 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ecf64e38-138d-4ef7-8b17-c09f30358f3e-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-5tqgx\" (UID: \"ecf64e38-138d-4ef7-8b17-c09f30358f3e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-5tqgx" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 
11:14:30.276767 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmqk5\" (UniqueName: \"kubernetes.io/projected/ecf64e38-138d-4ef7-8b17-c09f30358f3e-kube-api-access-pmqk5\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-5tqgx\" (UID: \"ecf64e38-138d-4ef7-8b17-c09f30358f3e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-5tqgx" Mar 20 11:14:30 crc kubenswrapper[4860]: E0320 11:14:30.277519 4860 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 11:14:30 crc kubenswrapper[4860]: E0320 11:14:30.277574 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ecf64e38-138d-4ef7-8b17-c09f30358f3e-cert podName:ecf64e38-138d-4ef7-8b17-c09f30358f3e nodeName:}" failed. No retries permitted until 2026-03-20 11:14:30.777558421 +0000 UTC m=+1194.998919319 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ecf64e38-138d-4ef7-8b17-c09f30358f3e-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-5tqgx" (UID: "ecf64e38-138d-4ef7-8b17-c09f30358f3e") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.296825 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-dvptb"] Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.311975 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-dvptb" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.317152 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99lpj\" (UniqueName: \"kubernetes.io/projected/d7202366-6dc1-45ca-bb9a-74bdd0426c5f-kube-api-access-99lpj\") pod \"mariadb-operator-controller-manager-67ccfc9778-m8948\" (UID: \"d7202366-6dc1-45ca-bb9a-74bdd0426c5f\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-m8948" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.317815 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-hz52j" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.358902 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-2vsjq" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.361819 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-pq75b" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.364080 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmqk5\" (UniqueName: \"kubernetes.io/projected/ecf64e38-138d-4ef7-8b17-c09f30358f3e-kube-api-access-pmqk5\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-5tqgx\" (UID: \"ecf64e38-138d-4ef7-8b17-c09f30358f3e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-5tqgx" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.364868 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-mc48w" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.372487 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7pvw\" (UniqueName: \"kubernetes.io/projected/431ab970-7f36-4ace-860c-479faac092a0-kube-api-access-q7pvw\") pod \"octavia-operator-controller-manager-5b9f45d989-tjt52\" (UID: \"431ab970-7f36-4ace-860c-479faac092a0\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-tjt52" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.372643 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-dvptb"] Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.380011 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbkkg\" (UniqueName: \"kubernetes.io/projected/cce5926a-9df6-4915-a94f-02cf2f74fccc-kube-api-access-rbkkg\") pod \"swift-operator-controller-manager-c674c5965-dvptb\" (UID: \"cce5926a-9df6-4915-a94f-02cf2f74fccc\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-dvptb" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.380120 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hq7gw\" (UniqueName: \"kubernetes.io/projected/7f73053a-86aa-42dc-bcca-ee26a4fda2e5-kube-api-access-hq7gw\") pod \"placement-operator-controller-manager-5784578c99-4tdg4\" (UID: \"7f73053a-86aa-42dc-bcca-ee26a4fda2e5\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-4tdg4" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.388515 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-4tdg4"] Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.400946 4860 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-jd9bn"] Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.402032 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-jd9bn" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.419690 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nn2x\" (UniqueName: \"kubernetes.io/projected/c736e6d7-6806-4ef3-a0b3-f1b17ab33037-kube-api-access-4nn2x\") pod \"ovn-operator-controller-manager-884679f54-4nk5c\" (UID: \"c736e6d7-6806-4ef3-a0b3-f1b17ab33037\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-4nk5c" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.426539 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-b4zcf"] Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.427736 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-b4zcf" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.433965 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-jd9bn"] Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.437637 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-z8fp5" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.439291 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-75zw2" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.441408 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-b4zcf"] Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.449208 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-tjt52" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.457040 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-zljps" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.493949 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-ncmzn"] Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.495349 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/70703379-8eb2-4f8a-95c8-302b53692a53-cert\") pod \"infra-operator-controller-manager-669fff9c7c-njzqs\" (UID: \"70703379-8eb2-4f8a-95c8-302b53692a53\") " pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-njzqs" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.495527 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbkkg\" (UniqueName: \"kubernetes.io/projected/cce5926a-9df6-4915-a94f-02cf2f74fccc-kube-api-access-rbkkg\") pod \"swift-operator-controller-manager-c674c5965-dvptb\" (UID: \"cce5926a-9df6-4915-a94f-02cf2f74fccc\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-dvptb" Mar 
20 11:14:30 crc kubenswrapper[4860]: E0320 11:14:30.496277 4860 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 11:14:30 crc kubenswrapper[4860]: E0320 11:14:30.496431 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70703379-8eb2-4f8a-95c8-302b53692a53-cert podName:70703379-8eb2-4f8a-95c8-302b53692a53 nodeName:}" failed. No retries permitted until 2026-03-20 11:14:31.496412824 +0000 UTC m=+1195.717773712 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/70703379-8eb2-4f8a-95c8-302b53692a53-cert") pod "infra-operator-controller-manager-669fff9c7c-njzqs" (UID: "70703379-8eb2-4f8a-95c8-302b53692a53") : secret "infra-operator-webhook-server-cert" not found Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.497218 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-4nk5c" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.529047 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hq7gw\" (UniqueName: \"kubernetes.io/projected/7f73053a-86aa-42dc-bcca-ee26a4fda2e5-kube-api-access-hq7gw\") pod \"placement-operator-controller-manager-5784578c99-4tdg4\" (UID: \"7f73053a-86aa-42dc-bcca-ee26a4fda2e5\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-4tdg4" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.537551 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-ncmzn" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.542093 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-ncmzn"] Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.553429 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbkkg\" (UniqueName: \"kubernetes.io/projected/cce5926a-9df6-4915-a94f-02cf2f74fccc-kube-api-access-rbkkg\") pod \"swift-operator-controller-manager-c674c5965-dvptb\" (UID: \"cce5926a-9df6-4915-a94f-02cf2f74fccc\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-dvptb" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.556105 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-l9f4c" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.614158 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-m8948" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.617484 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-4tdg4" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.618898 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dck6p\" (UniqueName: \"kubernetes.io/projected/f329ab6d-5c8c-4ed2-a830-d0a04bb31071-kube-api-access-dck6p\") pod \"test-operator-controller-manager-5c5cb9c4d7-b4zcf\" (UID: \"f329ab6d-5c8c-4ed2-a830-d0a04bb31071\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-b4zcf" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.619025 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhllj\" (UniqueName: \"kubernetes.io/projected/1723efcf-97d7-4101-a15d-d4776d45d29b-kube-api-access-qhllj\") pod \"watcher-operator-controller-manager-6c4d75f7f9-ncmzn\" (UID: \"1723efcf-97d7-4101-a15d-d4776d45d29b\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-ncmzn" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.619106 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ph84b\" (UniqueName: \"kubernetes.io/projected/b5e881e2-f657-418f-ba87-7074722307a2-kube-api-access-ph84b\") pod \"telemetry-operator-controller-manager-d6b694c5-jd9bn\" (UID: \"b5e881e2-f657-418f-ba87-7074722307a2\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-jd9bn" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.691173 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-dvptb" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.721253 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ph84b\" (UniqueName: \"kubernetes.io/projected/b5e881e2-f657-418f-ba87-7074722307a2-kube-api-access-ph84b\") pod \"telemetry-operator-controller-manager-d6b694c5-jd9bn\" (UID: \"b5e881e2-f657-418f-ba87-7074722307a2\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-jd9bn" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.721358 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dck6p\" (UniqueName: \"kubernetes.io/projected/f329ab6d-5c8c-4ed2-a830-d0a04bb31071-kube-api-access-dck6p\") pod \"test-operator-controller-manager-5c5cb9c4d7-b4zcf\" (UID: \"f329ab6d-5c8c-4ed2-a830-d0a04bb31071\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-b4zcf" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.721436 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhllj\" (UniqueName: \"kubernetes.io/projected/1723efcf-97d7-4101-a15d-d4776d45d29b-kube-api-access-qhllj\") pod \"watcher-operator-controller-manager-6c4d75f7f9-ncmzn\" (UID: \"1723efcf-97d7-4101-a15d-d4776d45d29b\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-ncmzn" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.773706 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ph84b\" (UniqueName: \"kubernetes.io/projected/b5e881e2-f657-418f-ba87-7074722307a2-kube-api-access-ph84b\") pod \"telemetry-operator-controller-manager-d6b694c5-jd9bn\" (UID: \"b5e881e2-f657-418f-ba87-7074722307a2\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-jd9bn" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.785000 4860 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-jd9bn" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.821049 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dck6p\" (UniqueName: \"kubernetes.io/projected/f329ab6d-5c8c-4ed2-a830-d0a04bb31071-kube-api-access-dck6p\") pod \"test-operator-controller-manager-5c5cb9c4d7-b4zcf\" (UID: \"f329ab6d-5c8c-4ed2-a830-d0a04bb31071\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-b4zcf" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.823360 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ecf64e38-138d-4ef7-8b17-c09f30358f3e-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-5tqgx\" (UID: \"ecf64e38-138d-4ef7-8b17-c09f30358f3e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-5tqgx" Mar 20 11:14:30 crc kubenswrapper[4860]: E0320 11:14:30.823551 4860 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 11:14:30 crc kubenswrapper[4860]: E0320 11:14:30.823605 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ecf64e38-138d-4ef7-8b17-c09f30358f3e-cert podName:ecf64e38-138d-4ef7-8b17-c09f30358f3e nodeName:}" failed. No retries permitted until 2026-03-20 11:14:31.823589188 +0000 UTC m=+1196.044950086 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ecf64e38-138d-4ef7-8b17-c09f30358f3e-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-5tqgx" (UID: "ecf64e38-138d-4ef7-8b17-c09f30358f3e") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.839883 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6697dffbc-hpk42"] Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.841128 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-hpk42" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.851801 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-rz2ll" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.852039 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.853869 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.869303 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhllj\" (UniqueName: \"kubernetes.io/projected/1723efcf-97d7-4101-a15d-d4776d45d29b-kube-api-access-qhllj\") pod \"watcher-operator-controller-manager-6c4d75f7f9-ncmzn\" (UID: \"1723efcf-97d7-4101-a15d-d4776d45d29b\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-ncmzn" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.878173 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6697dffbc-hpk42"] Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 
11:14:30.924491 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-metrics-certs\") pod \"openstack-operator-controller-manager-6697dffbc-hpk42\" (UID: \"84431296-0ca0-425a-8da8-c3ea46b08b29\") " pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-hpk42" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.924577 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6bb2\" (UniqueName: \"kubernetes.io/projected/84431296-0ca0-425a-8da8-c3ea46b08b29-kube-api-access-g6bb2\") pod \"openstack-operator-controller-manager-6697dffbc-hpk42\" (UID: \"84431296-0ca0-425a-8da8-c3ea46b08b29\") " pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-hpk42" Mar 20 11:14:30 crc kubenswrapper[4860]: I0320 11:14:30.924677 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-webhook-certs\") pod \"openstack-operator-controller-manager-6697dffbc-hpk42\" (UID: \"84431296-0ca0-425a-8da8-c3ea46b08b29\") " pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-hpk42" Mar 20 11:14:31 crc kubenswrapper[4860]: I0320 11:14:31.026475 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-webhook-certs\") pod \"openstack-operator-controller-manager-6697dffbc-hpk42\" (UID: \"84431296-0ca0-425a-8da8-c3ea46b08b29\") " pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-hpk42" Mar 20 11:14:31 crc kubenswrapper[4860]: I0320 11:14:31.026999 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-metrics-certs\") pod \"openstack-operator-controller-manager-6697dffbc-hpk42\" (UID: \"84431296-0ca0-425a-8da8-c3ea46b08b29\") " pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-hpk42" Mar 20 11:14:31 crc kubenswrapper[4860]: E0320 11:14:31.026691 4860 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 11:14:31 crc kubenswrapper[4860]: I0320 11:14:31.027055 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6bb2\" (UniqueName: \"kubernetes.io/projected/84431296-0ca0-425a-8da8-c3ea46b08b29-kube-api-access-g6bb2\") pod \"openstack-operator-controller-manager-6697dffbc-hpk42\" (UID: \"84431296-0ca0-425a-8da8-c3ea46b08b29\") " pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-hpk42" Mar 20 11:14:31 crc kubenswrapper[4860]: E0320 11:14:31.027107 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-webhook-certs podName:84431296-0ca0-425a-8da8-c3ea46b08b29 nodeName:}" failed. No retries permitted until 2026-03-20 11:14:31.527085434 +0000 UTC m=+1195.748446332 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-webhook-certs") pod "openstack-operator-controller-manager-6697dffbc-hpk42" (UID: "84431296-0ca0-425a-8da8-c3ea46b08b29") : secret "webhook-server-cert" not found Mar 20 11:14:31 crc kubenswrapper[4860]: E0320 11:14:31.027207 4860 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 11:14:31 crc kubenswrapper[4860]: E0320 11:14:31.027300 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-metrics-certs podName:84431296-0ca0-425a-8da8-c3ea46b08b29 nodeName:}" failed. No retries permitted until 2026-03-20 11:14:31.52727961 +0000 UTC m=+1195.748640508 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-metrics-certs") pod "openstack-operator-controller-manager-6697dffbc-hpk42" (UID: "84431296-0ca0-425a-8da8-c3ea46b08b29") : secret "metrics-server-cert" not found Mar 20 11:14:31 crc kubenswrapper[4860]: I0320 11:14:31.077147 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6bb2\" (UniqueName: \"kubernetes.io/projected/84431296-0ca0-425a-8da8-c3ea46b08b29-kube-api-access-g6bb2\") pod \"openstack-operator-controller-manager-6697dffbc-hpk42\" (UID: \"84431296-0ca0-425a-8da8-c3ea46b08b29\") " pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-hpk42" Mar 20 11:14:31 crc kubenswrapper[4860]: I0320 11:14:31.108350 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-b4zcf" Mar 20 11:14:31 crc kubenswrapper[4860]: I0320 11:14:31.174545 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-ncmzn" Mar 20 11:14:31 crc kubenswrapper[4860]: I0320 11:14:31.567197 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-metrics-certs\") pod \"openstack-operator-controller-manager-6697dffbc-hpk42\" (UID: \"84431296-0ca0-425a-8da8-c3ea46b08b29\") " pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-hpk42" Mar 20 11:14:31 crc kubenswrapper[4860]: I0320 11:14:31.567378 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/70703379-8eb2-4f8a-95c8-302b53692a53-cert\") pod \"infra-operator-controller-manager-669fff9c7c-njzqs\" (UID: \"70703379-8eb2-4f8a-95c8-302b53692a53\") " pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-njzqs" Mar 20 11:14:31 crc kubenswrapper[4860]: I0320 11:14:31.567433 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-webhook-certs\") pod \"openstack-operator-controller-manager-6697dffbc-hpk42\" (UID: \"84431296-0ca0-425a-8da8-c3ea46b08b29\") " pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-hpk42" Mar 20 11:14:31 crc kubenswrapper[4860]: E0320 11:14:31.567626 4860 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 11:14:31 crc kubenswrapper[4860]: E0320 11:14:31.567699 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-webhook-certs podName:84431296-0ca0-425a-8da8-c3ea46b08b29 nodeName:}" failed. No retries permitted until 2026-03-20 11:14:32.567676334 +0000 UTC m=+1196.789037232 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-webhook-certs") pod "openstack-operator-controller-manager-6697dffbc-hpk42" (UID: "84431296-0ca0-425a-8da8-c3ea46b08b29") : secret "webhook-server-cert" not found Mar 20 11:14:31 crc kubenswrapper[4860]: E0320 11:14:31.568262 4860 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 11:14:31 crc kubenswrapper[4860]: E0320 11:14:31.568301 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-metrics-certs podName:84431296-0ca0-425a-8da8-c3ea46b08b29 nodeName:}" failed. No retries permitted until 2026-03-20 11:14:32.56829061 +0000 UTC m=+1196.789651518 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-metrics-certs") pod "openstack-operator-controller-manager-6697dffbc-hpk42" (UID: "84431296-0ca0-425a-8da8-c3ea46b08b29") : secret "metrics-server-cert" not found Mar 20 11:14:31 crc kubenswrapper[4860]: E0320 11:14:31.568377 4860 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 11:14:31 crc kubenswrapper[4860]: E0320 11:14:31.568406 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70703379-8eb2-4f8a-95c8-302b53692a53-cert podName:70703379-8eb2-4f8a-95c8-302b53692a53 nodeName:}" failed. No retries permitted until 2026-03-20 11:14:33.568396873 +0000 UTC m=+1197.789757771 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/70703379-8eb2-4f8a-95c8-302b53692a53-cert") pod "infra-operator-controller-manager-669fff9c7c-njzqs" (UID: "70703379-8eb2-4f8a-95c8-302b53692a53") : secret "infra-operator-webhook-server-cert" not found Mar 20 11:14:31 crc kubenswrapper[4860]: I0320 11:14:31.875173 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ecf64e38-138d-4ef7-8b17-c09f30358f3e-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-5tqgx\" (UID: \"ecf64e38-138d-4ef7-8b17-c09f30358f3e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-5tqgx" Mar 20 11:14:31 crc kubenswrapper[4860]: E0320 11:14:31.876719 4860 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 11:14:31 crc kubenswrapper[4860]: E0320 11:14:31.877443 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ecf64e38-138d-4ef7-8b17-c09f30358f3e-cert podName:ecf64e38-138d-4ef7-8b17-c09f30358f3e nodeName:}" failed. No retries permitted until 2026-03-20 11:14:33.877392735 +0000 UTC m=+1198.098753633 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ecf64e38-138d-4ef7-8b17-c09f30358f3e-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-5tqgx" (UID: "ecf64e38-138d-4ef7-8b17-c09f30358f3e") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 11:14:32 crc kubenswrapper[4860]: I0320 11:14:32.101691 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-2vsjq"] Mar 20 11:14:32 crc kubenswrapper[4860]: I0320 11:14:32.121299 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-s2kwq"] Mar 20 11:14:32 crc kubenswrapper[4860]: I0320 11:14:32.146917 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-8dh72"] Mar 20 11:14:32 crc kubenswrapper[4860]: I0320 11:14:32.200044 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-vw2d9"] Mar 20 11:14:32 crc kubenswrapper[4860]: W0320 11:14:32.266441 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20d35dc6_0fc2_4651_9dcd_855814132a5f.slice/crio-2fb6847ae3e5e9932ad364315ce61b7497472147fbe6029885f0eaaa5991042b WatchSource:0}: Error finding container 2fb6847ae3e5e9932ad364315ce61b7497472147fbe6029885f0eaaa5991042b: Status 404 returned error can't find the container with id 2fb6847ae3e5e9932ad364315ce61b7497472147fbe6029885f0eaaa5991042b Mar 20 11:14:32 crc kubenswrapper[4860]: I0320 11:14:32.272670 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-pzk5m"] Mar 20 11:14:32 crc kubenswrapper[4860]: W0320 11:14:32.274098 4860 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36138670_7449_4d49_8a23_73b57d10b67f.slice/crio-653d999fec5003aae812deaa1315681ba24a09182a99ff5606e9a5eb982df48f WatchSource:0}: Error finding container 653d999fec5003aae812deaa1315681ba24a09182a99ff5606e9a5eb982df48f: Status 404 returned error can't find the container with id 653d999fec5003aae812deaa1315681ba24a09182a99ff5606e9a5eb982df48f Mar 20 11:14:32 crc kubenswrapper[4860]: I0320 11:14:32.286548 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-2692b"] Mar 20 11:14:32 crc kubenswrapper[4860]: W0320 11:14:32.302561 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcce5926a_9df6_4915_a94f_02cf2f74fccc.slice/crio-ba3ced7813e4fd7fd87849bcff9024dafc3336c86d4c304ecb54685fadca0e60 WatchSource:0}: Error finding container ba3ced7813e4fd7fd87849bcff9024dafc3336c86d4c304ecb54685fadca0e60: Status 404 returned error can't find the container with id ba3ced7813e4fd7fd87849bcff9024dafc3336c86d4c304ecb54685fadca0e60 Mar 20 11:14:32 crc kubenswrapper[4860]: W0320 11:14:32.304252 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podacf57205_3b95_48a3_8222_1b57b0b6c54b.slice/crio-e927bae90201f5a5587c43f8dd59c8f99798df5507e3f73c3d5ab213af1af82e WatchSource:0}: Error finding container e927bae90201f5a5587c43f8dd59c8f99798df5507e3f73c3d5ab213af1af82e: Status 404 returned error can't find the container with id e927bae90201f5a5587c43f8dd59c8f99798df5507e3f73c3d5ab213af1af82e Mar 20 11:14:32 crc kubenswrapper[4860]: W0320 11:14:32.306809 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f73053a_86aa_42dc_bcca_ee26a4fda2e5.slice/crio-37f3ed3334cc6fb0c5672170f44bb421e636f431221af8f9255733a9bade8084 
WatchSource:0}: Error finding container 37f3ed3334cc6fb0c5672170f44bb421e636f431221af8f9255733a9bade8084: Status 404 returned error can't find the container with id 37f3ed3334cc6fb0c5672170f44bb421e636f431221af8f9255733a9bade8084 Mar 20 11:14:32 crc kubenswrapper[4860]: I0320 11:14:32.309890 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-zphz9"] Mar 20 11:14:32 crc kubenswrapper[4860]: E0320 11:14:32.315858 4860 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:c8743a6661d118b0e5ba3eb110643358a8a3237dc75984a8f9829880b55a1622,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hq7gw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 
8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-5784578c99-4tdg4_openstack-operators(7f73053a-86aa-42dc-bcca-ee26a4fda2e5): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 11:14:32 crc kubenswrapper[4860]: E0320 11:14:32.317146 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-5784578c99-4tdg4" podUID="7f73053a-86aa-42dc-bcca-ee26a4fda2e5" Mar 20 11:14:32 crc kubenswrapper[4860]: I0320 11:14:32.318168 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-dvptb"] Mar 20 11:14:32 crc kubenswrapper[4860]: I0320 11:14:32.326590 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-mc48w"] Mar 20 11:14:32 crc kubenswrapper[4860]: I0320 11:14:32.335048 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-4tdg4"] Mar 20 11:14:32 crc kubenswrapper[4860]: I0320 11:14:32.350281 4860 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-wfczk"] Mar 20 11:14:32 crc kubenswrapper[4860]: I0320 11:14:32.489203 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-4nk5c"] Mar 20 11:14:32 crc kubenswrapper[4860]: I0320 11:14:32.517573 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-b4zcf"] Mar 20 11:14:32 crc kubenswrapper[4860]: E0320 11:14:32.552181 4860 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ph84b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-d6b694c5-jd9bn_openstack-operators(b5e881e2-f657-418f-ba87-7074722307a2): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 11:14:32 crc kubenswrapper[4860]: E0320 11:14:32.553590 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-jd9bn" podUID="b5e881e2-f657-418f-ba87-7074722307a2" Mar 20 11:14:32 crc kubenswrapper[4860]: E0320 11:14:32.558919 4860 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:425fd66675becbe0ca2b2fe1a5a6694ac6e0b1cdce9a77a7a37f99785eadc74a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-q7pvw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-5b9f45d989-tjt52_openstack-operators(431ab970-7f36-4ace-860c-479faac092a0): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 11:14:32 crc kubenswrapper[4860]: E0320 11:14:32.560151 4860 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:6e7552996253fc66667eaa3eb0e11b4e97145efa2ae577155ceabf8e9913ddc1,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-99lpj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-67ccfc9778-m8948_openstack-operators(d7202366-6dc1-45ca-bb9a-74bdd0426c5f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 11:14:32 crc kubenswrapper[4860]: E0320 11:14:32.560267 4860 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dzznl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-5d488d59fb-z8fp5_openstack-operators(6c2530cf-70b4-4a89-acff-086b36773edf): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 11:14:32 crc kubenswrapper[4860]: E0320 11:14:32.560357 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-tjt52" podUID="431ab970-7f36-4ace-860c-479faac092a0" Mar 20 11:14:32 crc kubenswrapper[4860]: E0320 11:14:32.561552 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-m8948" podUID="d7202366-6dc1-45ca-bb9a-74bdd0426c5f" Mar 20 11:14:32 crc kubenswrapper[4860]: E0320 11:14:32.561632 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-z8fp5" podUID="6c2530cf-70b4-4a89-acff-086b36773edf" Mar 20 11:14:32 crc 
kubenswrapper[4860]: I0320 11:14:32.570785 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-jd9bn"] Mar 20 11:14:32 crc kubenswrapper[4860]: I0320 11:14:32.580198 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-z8fp5"] Mar 20 11:14:32 crc kubenswrapper[4860]: I0320 11:14:32.587849 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-m8948"] Mar 20 11:14:32 crc kubenswrapper[4860]: I0320 11:14:32.595608 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-tjt52"] Mar 20 11:14:32 crc kubenswrapper[4860]: I0320 11:14:32.608986 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-webhook-certs\") pod \"openstack-operator-controller-manager-6697dffbc-hpk42\" (UID: \"84431296-0ca0-425a-8da8-c3ea46b08b29\") " pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-hpk42" Mar 20 11:14:32 crc kubenswrapper[4860]: I0320 11:14:32.609078 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-metrics-certs\") pod \"openstack-operator-controller-manager-6697dffbc-hpk42\" (UID: \"84431296-0ca0-425a-8da8-c3ea46b08b29\") " pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-hpk42" Mar 20 11:14:32 crc kubenswrapper[4860]: E0320 11:14:32.610356 4860 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 11:14:32 crc kubenswrapper[4860]: E0320 11:14:32.610436 4860 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-webhook-certs podName:84431296-0ca0-425a-8da8-c3ea46b08b29 nodeName:}" failed. No retries permitted until 2026-03-20 11:14:34.61041172 +0000 UTC m=+1198.831772618 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-webhook-certs") pod "openstack-operator-controller-manager-6697dffbc-hpk42" (UID: "84431296-0ca0-425a-8da8-c3ea46b08b29") : secret "webhook-server-cert" not found Mar 20 11:14:32 crc kubenswrapper[4860]: E0320 11:14:32.610631 4860 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 11:14:32 crc kubenswrapper[4860]: E0320 11:14:32.610745 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-metrics-certs podName:84431296-0ca0-425a-8da8-c3ea46b08b29 nodeName:}" failed. No retries permitted until 2026-03-20 11:14:34.610720589 +0000 UTC m=+1198.832081487 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-metrics-certs") pod "openstack-operator-controller-manager-6697dffbc-hpk42" (UID: "84431296-0ca0-425a-8da8-c3ea46b08b29") : secret "metrics-server-cert" not found Mar 20 11:14:32 crc kubenswrapper[4860]: I0320 11:14:32.651878 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-ncmzn"] Mar 20 11:14:32 crc kubenswrapper[4860]: E0320 11:14:32.659029 4860 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-f9jc4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-768b96df4c-pq75b_openstack-operators(fbbe8243-9afb-4fc5-90f1-04d6f0c074ef): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 11:14:32 crc kubenswrapper[4860]: E0320 11:14:32.659150 4860 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qhllj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6c4d75f7f9-ncmzn_openstack-operators(1723efcf-97d7-4101-a15d-d4776d45d29b): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 11:14:32 crc kubenswrapper[4860]: E0320 11:14:32.660353 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-pq75b" podUID="fbbe8243-9afb-4fc5-90f1-04d6f0c074ef" Mar 20 11:14:32 crc kubenswrapper[4860]: E0320 11:14:32.660452 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-ncmzn" podUID="1723efcf-97d7-4101-a15d-d4776d45d29b" Mar 20 11:14:32 crc kubenswrapper[4860]: I0320 11:14:32.666736 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-pq75b"] Mar 20 11:14:32 crc kubenswrapper[4860]: I0320 11:14:32.996691 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/swift-operator-controller-manager-c674c5965-dvptb" event={"ID":"cce5926a-9df6-4915-a94f-02cf2f74fccc","Type":"ContainerStarted","Data":"ba3ced7813e4fd7fd87849bcff9024dafc3336c86d4c304ecb54685fadca0e60"} Mar 20 11:14:32 crc kubenswrapper[4860]: I0320 11:14:32.998865 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-pq75b" event={"ID":"fbbe8243-9afb-4fc5-90f1-04d6f0c074ef","Type":"ContainerStarted","Data":"259ac670c13cada7887f8ddd23af8e8f420390512c179c02aa9d62ec42d77c53"} Mar 20 11:14:33 crc kubenswrapper[4860]: E0320 11:14:33.001445 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-pq75b" podUID="fbbe8243-9afb-4fc5-90f1-04d6f0c074ef" Mar 20 11:14:33 crc kubenswrapper[4860]: I0320 11:14:33.004424 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-zphz9" event={"ID":"5fdbc315-f7fd-47ac-aa39-fdbe068f6f3b","Type":"ContainerStarted","Data":"57238a829c24483edc7b7f4ba4a3f4af0b1f67a21336c58eb92c49353e2913e0"} Mar 20 11:14:33 crc kubenswrapper[4860]: I0320 11:14:33.005860 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-2vsjq" event={"ID":"29801d0c-963e-4b38-ad2d-8b03d3ade0be","Type":"ContainerStarted","Data":"d1b61d4bc46f0fc8a3dcdadce15cdaacd80fb80faeea6db2b60ecbb47c06435d"} Mar 20 11:14:33 crc kubenswrapper[4860]: I0320 11:14:33.008618 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-jd9bn" 
event={"ID":"b5e881e2-f657-418f-ba87-7074722307a2","Type":"ContainerStarted","Data":"23e2e5a1441c2cc89cc776a76142fcf6619b8f62fd5add6a8a4e9bce0a9588c2"} Mar 20 11:14:33 crc kubenswrapper[4860]: E0320 11:14:33.010812 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-jd9bn" podUID="b5e881e2-f657-418f-ba87-7074722307a2" Mar 20 11:14:33 crc kubenswrapper[4860]: I0320 11:14:33.012663 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-s2kwq" event={"ID":"178fff2d-699c-4cab-8626-3e30a6bd9ed6","Type":"ContainerStarted","Data":"7d1d04346c0229f6683d4f3f87b8356f2be76cf66c4debdd377435df577f13ab"} Mar 20 11:14:33 crc kubenswrapper[4860]: I0320 11:14:33.017115 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-4nk5c" event={"ID":"c736e6d7-6806-4ef3-a0b3-f1b17ab33037","Type":"ContainerStarted","Data":"855e5b3d56507d09b7fb79e340a42bff1527e6ec7b7c44f650fa33a7e4d296d7"} Mar 20 11:14:33 crc kubenswrapper[4860]: I0320 11:14:33.033113 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-mc48w" event={"ID":"acf57205-3b95-48a3-8222-1b57b0b6c54b","Type":"ContainerStarted","Data":"e927bae90201f5a5587c43f8dd59c8f99798df5507e3f73c3d5ab213af1af82e"} Mar 20 11:14:33 crc kubenswrapper[4860]: I0320 11:14:33.035428 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-z8fp5" 
event={"ID":"6c2530cf-70b4-4a89-acff-086b36773edf","Type":"ContainerStarted","Data":"f9cb0b7f58056f0f34d7bf252bd1a03483971006e58fb26b7f71b9cbc4cd8a1e"} Mar 20 11:14:33 crc kubenswrapper[4860]: E0320 11:14:33.043604 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a\\\"\"" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-z8fp5" podUID="6c2530cf-70b4-4a89-acff-086b36773edf" Mar 20 11:14:33 crc kubenswrapper[4860]: I0320 11:14:33.051929 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-tjt52" event={"ID":"431ab970-7f36-4ace-860c-479faac092a0","Type":"ContainerStarted","Data":"1d7e414246e6b5722ed876b7bc9372effb5108268900a8295c5f131fdb837104"} Mar 20 11:14:33 crc kubenswrapper[4860]: I0320 11:14:33.053866 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-8dh72" event={"ID":"8b4d2530-4f67-45e8-9444-bea25fdad6ae","Type":"ContainerStarted","Data":"7e09276a073074ff1b26a9e545b83e81d4d87c1adb1d28bbd3bde3edf96ac0b3"} Mar 20 11:14:33 crc kubenswrapper[4860]: E0320 11:14:33.055184 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:425fd66675becbe0ca2b2fe1a5a6694ac6e0b1cdce9a77a7a37f99785eadc74a\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-tjt52" podUID="431ab970-7f36-4ace-860c-479faac092a0" Mar 20 11:14:33 crc kubenswrapper[4860]: I0320 11:14:33.056052 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-2692b" 
event={"ID":"20d35dc6-0fc2-4651-9dcd-855814132a5f","Type":"ContainerStarted","Data":"2fb6847ae3e5e9932ad364315ce61b7497472147fbe6029885f0eaaa5991042b"} Mar 20 11:14:33 crc kubenswrapper[4860]: I0320 11:14:33.057739 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-vw2d9" event={"ID":"36138670-7449-4d49-8a23-73b57d10b67f","Type":"ContainerStarted","Data":"653d999fec5003aae812deaa1315681ba24a09182a99ff5606e9a5eb982df48f"} Mar 20 11:14:33 crc kubenswrapper[4860]: I0320 11:14:33.059957 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-pzk5m" event={"ID":"0fe9b978-da91-4568-9b77-0d5930aca888","Type":"ContainerStarted","Data":"b10464282f54fb9a13c150e1ad56528f1ef54ca6ca0e893d367b6788861ad279"} Mar 20 11:14:33 crc kubenswrapper[4860]: I0320 11:14:33.066928 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-b4zcf" event={"ID":"f329ab6d-5c8c-4ed2-a830-d0a04bb31071","Type":"ContainerStarted","Data":"52701df3a58a4615ad9d527ff7cb91a8105f07c2e9cf7e70925c42479dfdfa8a"} Mar 20 11:14:33 crc kubenswrapper[4860]: I0320 11:14:33.072535 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-m8948" event={"ID":"d7202366-6dc1-45ca-bb9a-74bdd0426c5f","Type":"ContainerStarted","Data":"3a7e6c9120211a60f9c0ff50ac9a9f7a16cc3d0c7d24b0931885e87c80cfd819"} Mar 20 11:14:33 crc kubenswrapper[4860]: E0320 11:14:33.075482 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:6e7552996253fc66667eaa3eb0e11b4e97145efa2ae577155ceabf8e9913ddc1\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-m8948" 
podUID="d7202366-6dc1-45ca-bb9a-74bdd0426c5f" Mar 20 11:14:33 crc kubenswrapper[4860]: I0320 11:14:33.077371 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-wfczk" event={"ID":"c54f27c4-bd61-4bad-bf91-376fee65d219","Type":"ContainerStarted","Data":"27e0f4781c13dca181650d9f15d4a4b37f69875aa55ee0c9a9a2b76706b51bfb"} Mar 20 11:14:33 crc kubenswrapper[4860]: I0320 11:14:33.079602 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-4tdg4" event={"ID":"7f73053a-86aa-42dc-bcca-ee26a4fda2e5","Type":"ContainerStarted","Data":"37f3ed3334cc6fb0c5672170f44bb421e636f431221af8f9255733a9bade8084"} Mar 20 11:14:33 crc kubenswrapper[4860]: I0320 11:14:33.086653 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-ncmzn" event={"ID":"1723efcf-97d7-4101-a15d-d4776d45d29b","Type":"ContainerStarted","Data":"42af6d544fb4a25beae254e15fa4f1f8fffbfa86ae5c09d56b99174d27dafbb1"} Mar 20 11:14:33 crc kubenswrapper[4860]: E0320 11:14:33.087620 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:c8743a6661d118b0e5ba3eb110643358a8a3237dc75984a8f9829880b55a1622\\\"\"" pod="openstack-operators/placement-operator-controller-manager-5784578c99-4tdg4" podUID="7f73053a-86aa-42dc-bcca-ee26a4fda2e5" Mar 20 11:14:33 crc kubenswrapper[4860]: E0320 11:14:33.088440 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" 
pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-ncmzn" podUID="1723efcf-97d7-4101-a15d-d4776d45d29b" Mar 20 11:14:33 crc kubenswrapper[4860]: I0320 11:14:33.629362 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/70703379-8eb2-4f8a-95c8-302b53692a53-cert\") pod \"infra-operator-controller-manager-669fff9c7c-njzqs\" (UID: \"70703379-8eb2-4f8a-95c8-302b53692a53\") " pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-njzqs" Mar 20 11:14:33 crc kubenswrapper[4860]: E0320 11:14:33.630095 4860 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 11:14:33 crc kubenswrapper[4860]: E0320 11:14:33.630154 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70703379-8eb2-4f8a-95c8-302b53692a53-cert podName:70703379-8eb2-4f8a-95c8-302b53692a53 nodeName:}" failed. No retries permitted until 2026-03-20 11:14:37.630138365 +0000 UTC m=+1201.851499263 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/70703379-8eb2-4f8a-95c8-302b53692a53-cert") pod "infra-operator-controller-manager-669fff9c7c-njzqs" (UID: "70703379-8eb2-4f8a-95c8-302b53692a53") : secret "infra-operator-webhook-server-cert" not found Mar 20 11:14:33 crc kubenswrapper[4860]: I0320 11:14:33.952811 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ecf64e38-138d-4ef7-8b17-c09f30358f3e-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-5tqgx\" (UID: \"ecf64e38-138d-4ef7-8b17-c09f30358f3e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-5tqgx" Mar 20 11:14:33 crc kubenswrapper[4860]: E0320 11:14:33.953115 4860 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 11:14:33 crc kubenswrapper[4860]: E0320 11:14:33.953181 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ecf64e38-138d-4ef7-8b17-c09f30358f3e-cert podName:ecf64e38-138d-4ef7-8b17-c09f30358f3e nodeName:}" failed. No retries permitted until 2026-03-20 11:14:37.953163567 +0000 UTC m=+1202.174524465 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ecf64e38-138d-4ef7-8b17-c09f30358f3e-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-5tqgx" (UID: "ecf64e38-138d-4ef7-8b17-c09f30358f3e") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 11:14:34 crc kubenswrapper[4860]: E0320 11:14:34.104801 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-jd9bn" podUID="b5e881e2-f657-418f-ba87-7074722307a2" Mar 20 11:14:34 crc kubenswrapper[4860]: E0320 11:14:34.104798 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-ncmzn" podUID="1723efcf-97d7-4101-a15d-d4776d45d29b" Mar 20 11:14:34 crc kubenswrapper[4860]: E0320 11:14:34.105431 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a\\\"\"" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-z8fp5" podUID="6c2530cf-70b4-4a89-acff-086b36773edf" Mar 20 11:14:34 crc kubenswrapper[4860]: E0320 11:14:34.105505 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/placement-operator@sha256:c8743a6661d118b0e5ba3eb110643358a8a3237dc75984a8f9829880b55a1622\\\"\"" pod="openstack-operators/placement-operator-controller-manager-5784578c99-4tdg4" podUID="7f73053a-86aa-42dc-bcca-ee26a4fda2e5" Mar 20 11:14:34 crc kubenswrapper[4860]: E0320 11:14:34.105550 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-pq75b" podUID="fbbe8243-9afb-4fc5-90f1-04d6f0c074ef" Mar 20 11:14:34 crc kubenswrapper[4860]: E0320 11:14:34.108428 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:6e7552996253fc66667eaa3eb0e11b4e97145efa2ae577155ceabf8e9913ddc1\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-m8948" podUID="d7202366-6dc1-45ca-bb9a-74bdd0426c5f" Mar 20 11:14:34 crc kubenswrapper[4860]: E0320 11:14:34.108625 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:425fd66675becbe0ca2b2fe1a5a6694ac6e0b1cdce9a77a7a37f99785eadc74a\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-tjt52" podUID="431ab970-7f36-4ace-860c-479faac092a0" Mar 20 11:14:34 crc kubenswrapper[4860]: I0320 11:14:34.668339 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-webhook-certs\") pod \"openstack-operator-controller-manager-6697dffbc-hpk42\" 
(UID: \"84431296-0ca0-425a-8da8-c3ea46b08b29\") " pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-hpk42" Mar 20 11:14:34 crc kubenswrapper[4860]: I0320 11:14:34.668421 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-metrics-certs\") pod \"openstack-operator-controller-manager-6697dffbc-hpk42\" (UID: \"84431296-0ca0-425a-8da8-c3ea46b08b29\") " pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-hpk42" Mar 20 11:14:34 crc kubenswrapper[4860]: E0320 11:14:34.668626 4860 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 11:14:34 crc kubenswrapper[4860]: E0320 11:14:34.668699 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-metrics-certs podName:84431296-0ca0-425a-8da8-c3ea46b08b29 nodeName:}" failed. No retries permitted until 2026-03-20 11:14:38.668674449 +0000 UTC m=+1202.890035347 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-metrics-certs") pod "openstack-operator-controller-manager-6697dffbc-hpk42" (UID: "84431296-0ca0-425a-8da8-c3ea46b08b29") : secret "metrics-server-cert" not found Mar 20 11:14:34 crc kubenswrapper[4860]: E0320 11:14:34.669179 4860 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 11:14:34 crc kubenswrapper[4860]: E0320 11:14:34.669213 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-webhook-certs podName:84431296-0ca0-425a-8da8-c3ea46b08b29 nodeName:}" failed. No retries permitted until 2026-03-20 11:14:38.669202664 +0000 UTC m=+1202.890563562 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-webhook-certs") pod "openstack-operator-controller-manager-6697dffbc-hpk42" (UID: "84431296-0ca0-425a-8da8-c3ea46b08b29") : secret "webhook-server-cert" not found Mar 20 11:14:37 crc kubenswrapper[4860]: I0320 11:14:37.656523 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/70703379-8eb2-4f8a-95c8-302b53692a53-cert\") pod \"infra-operator-controller-manager-669fff9c7c-njzqs\" (UID: \"70703379-8eb2-4f8a-95c8-302b53692a53\") " pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-njzqs" Mar 20 11:14:37 crc kubenswrapper[4860]: E0320 11:14:37.657020 4860 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 11:14:37 crc kubenswrapper[4860]: E0320 11:14:37.657495 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70703379-8eb2-4f8a-95c8-302b53692a53-cert podName:70703379-8eb2-4f8a-95c8-302b53692a53 nodeName:}" failed. No retries permitted until 2026-03-20 11:14:45.657339845 +0000 UTC m=+1209.878700743 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/70703379-8eb2-4f8a-95c8-302b53692a53-cert") pod "infra-operator-controller-manager-669fff9c7c-njzqs" (UID: "70703379-8eb2-4f8a-95c8-302b53692a53") : secret "infra-operator-webhook-server-cert" not found Mar 20 11:14:37 crc kubenswrapper[4860]: I0320 11:14:37.961419 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ecf64e38-138d-4ef7-8b17-c09f30358f3e-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-5tqgx\" (UID: \"ecf64e38-138d-4ef7-8b17-c09f30358f3e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-5tqgx" Mar 20 11:14:37 crc kubenswrapper[4860]: E0320 11:14:37.961680 4860 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 11:14:37 crc kubenswrapper[4860]: E0320 11:14:37.961803 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ecf64e38-138d-4ef7-8b17-c09f30358f3e-cert podName:ecf64e38-138d-4ef7-8b17-c09f30358f3e nodeName:}" failed. No retries permitted until 2026-03-20 11:14:45.961777504 +0000 UTC m=+1210.183138402 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ecf64e38-138d-4ef7-8b17-c09f30358f3e-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-5tqgx" (UID: "ecf64e38-138d-4ef7-8b17-c09f30358f3e") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 11:14:38 crc kubenswrapper[4860]: I0320 11:14:38.672281 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-webhook-certs\") pod \"openstack-operator-controller-manager-6697dffbc-hpk42\" (UID: \"84431296-0ca0-425a-8da8-c3ea46b08b29\") " pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-hpk42" Mar 20 11:14:38 crc kubenswrapper[4860]: I0320 11:14:38.672369 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-metrics-certs\") pod \"openstack-operator-controller-manager-6697dffbc-hpk42\" (UID: \"84431296-0ca0-425a-8da8-c3ea46b08b29\") " pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-hpk42" Mar 20 11:14:38 crc kubenswrapper[4860]: E0320 11:14:38.672529 4860 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 11:14:38 crc kubenswrapper[4860]: E0320 11:14:38.672574 4860 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 11:14:38 crc kubenswrapper[4860]: E0320 11:14:38.672639 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-webhook-certs podName:84431296-0ca0-425a-8da8-c3ea46b08b29 nodeName:}" failed. No retries permitted until 2026-03-20 11:14:46.67261315 +0000 UTC m=+1210.893974048 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-webhook-certs") pod "openstack-operator-controller-manager-6697dffbc-hpk42" (UID: "84431296-0ca0-425a-8da8-c3ea46b08b29") : secret "webhook-server-cert" not found Mar 20 11:14:38 crc kubenswrapper[4860]: E0320 11:14:38.672664 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-metrics-certs podName:84431296-0ca0-425a-8da8-c3ea46b08b29 nodeName:}" failed. No retries permitted until 2026-03-20 11:14:46.672653281 +0000 UTC m=+1210.894014269 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-metrics-certs") pod "openstack-operator-controller-manager-6697dffbc-hpk42" (UID: "84431296-0ca0-425a-8da8-c3ea46b08b29") : secret "metrics-server-cert" not found Mar 20 11:14:41 crc kubenswrapper[4860]: I0320 11:14:41.524200 4860 scope.go:117] "RemoveContainer" containerID="478ee16ae7828909784a1f93be49bfc8c3fee1419599f3474cd82711371e05b3" Mar 20 11:14:44 crc kubenswrapper[4860]: E0320 11:14:44.745178 4860 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/glance-operator@sha256:76a1cde9f29fb39ed715b06be16adb803b9a2e24d68acb369911c0a88e33bc7d" Mar 20 11:14:44 crc kubenswrapper[4860]: E0320 11:14:44.745408 4860 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:76a1cde9f29fb39ed715b06be16adb803b9a2e24d68acb369911c0a88e33bc7d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vgkkw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-79df6bcc97-zphz9_openstack-operators(5fdbc315-f7fd-47ac-aa39-fdbe068f6f3b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 11:14:44 crc kubenswrapper[4860]: E0320 11:14:44.746601 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-zphz9" podUID="5fdbc315-f7fd-47ac-aa39-fdbe068f6f3b" Mar 20 11:14:45 crc kubenswrapper[4860]: E0320 11:14:45.209785 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/glance-operator@sha256:76a1cde9f29fb39ed715b06be16adb803b9a2e24d68acb369911c0a88e33bc7d\\\"\"" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-zphz9" podUID="5fdbc315-f7fd-47ac-aa39-fdbe068f6f3b" Mar 20 11:14:45 crc kubenswrapper[4860]: I0320 11:14:45.744768 4860 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/70703379-8eb2-4f8a-95c8-302b53692a53-cert\") pod \"infra-operator-controller-manager-669fff9c7c-njzqs\" (UID: \"70703379-8eb2-4f8a-95c8-302b53692a53\") " pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-njzqs" Mar 20 11:14:45 crc kubenswrapper[4860]: I0320 11:14:45.763651 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/70703379-8eb2-4f8a-95c8-302b53692a53-cert\") pod \"infra-operator-controller-manager-669fff9c7c-njzqs\" (UID: \"70703379-8eb2-4f8a-95c8-302b53692a53\") " pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-njzqs" Mar 20 11:14:45 crc kubenswrapper[4860]: I0320 11:14:45.881882 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-m7gfz" Mar 20 11:14:45 crc kubenswrapper[4860]: I0320 11:14:45.888907 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-njzqs" Mar 20 11:14:46 crc kubenswrapper[4860]: I0320 11:14:46.048113 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ecf64e38-138d-4ef7-8b17-c09f30358f3e-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-5tqgx\" (UID: \"ecf64e38-138d-4ef7-8b17-c09f30358f3e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-5tqgx" Mar 20 11:14:46 crc kubenswrapper[4860]: E0320 11:14:46.048411 4860 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 11:14:46 crc kubenswrapper[4860]: E0320 11:14:46.048524 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ecf64e38-138d-4ef7-8b17-c09f30358f3e-cert podName:ecf64e38-138d-4ef7-8b17-c09f30358f3e nodeName:}" failed. No retries permitted until 2026-03-20 11:15:02.048501618 +0000 UTC m=+1226.269862516 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ecf64e38-138d-4ef7-8b17-c09f30358f3e-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-5tqgx" (UID: "ecf64e38-138d-4ef7-8b17-c09f30358f3e") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 11:14:46 crc kubenswrapper[4860]: I0320 11:14:46.759640 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-metrics-certs\") pod \"openstack-operator-controller-manager-6697dffbc-hpk42\" (UID: \"84431296-0ca0-425a-8da8-c3ea46b08b29\") " pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-hpk42" Mar 20 11:14:46 crc kubenswrapper[4860]: I0320 11:14:46.759864 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-webhook-certs\") pod \"openstack-operator-controller-manager-6697dffbc-hpk42\" (UID: \"84431296-0ca0-425a-8da8-c3ea46b08b29\") " pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-hpk42" Mar 20 11:14:46 crc kubenswrapper[4860]: E0320 11:14:46.759928 4860 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 11:14:46 crc kubenswrapper[4860]: E0320 11:14:46.760036 4860 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 11:14:46 crc kubenswrapper[4860]: E0320 11:14:46.760054 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-metrics-certs podName:84431296-0ca0-425a-8da8-c3ea46b08b29 nodeName:}" failed. No retries permitted until 2026-03-20 11:15:02.760022392 +0000 UTC m=+1226.981383300 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-metrics-certs") pod "openstack-operator-controller-manager-6697dffbc-hpk42" (UID: "84431296-0ca0-425a-8da8-c3ea46b08b29") : secret "metrics-server-cert" not found Mar 20 11:14:46 crc kubenswrapper[4860]: E0320 11:14:46.760181 4860 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-webhook-certs podName:84431296-0ca0-425a-8da8-c3ea46b08b29 nodeName:}" failed. No retries permitted until 2026-03-20 11:15:02.760152375 +0000 UTC m=+1226.981513423 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-webhook-certs") pod "openstack-operator-controller-manager-6697dffbc-hpk42" (UID: "84431296-0ca0-425a-8da8-c3ea46b08b29") : secret "webhook-server-cert" not found Mar 20 11:14:46 crc kubenswrapper[4860]: E0320 11:14:46.926513 4860 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/horizon-operator@sha256:703ad3a2b749bce100f1e2a445312b65dc3b8b45e8c8ba59f311d3f8f3368113" Mar 20 11:14:46 crc kubenswrapper[4860]: E0320 11:14:46.926734 4860 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:703ad3a2b749bce100f1e2a445312b65dc3b8b45e8c8ba59f311d3f8f3368113,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pmzc5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-8464cc45fb-wfczk_openstack-operators(c54f27c4-bd61-4bad-bf91-376fee65d219): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 11:14:46 crc kubenswrapper[4860]: E0320 11:14:46.927927 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-wfczk" podUID="c54f27c4-bd61-4bad-bf91-376fee65d219" Mar 20 11:14:47 crc kubenswrapper[4860]: E0320 11:14:47.226446 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:703ad3a2b749bce100f1e2a445312b65dc3b8b45e8c8ba59f311d3f8f3368113\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-wfczk" podUID="c54f27c4-bd61-4bad-bf91-376fee65d219" Mar 20 11:14:47 crc kubenswrapper[4860]: E0320 11:14:47.728620 4860 log.go:32] "PullImage from image service 
failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/designate-operator@sha256:12841b27173f5f1beeb83112e057c8753f4cf411f583fba4f0610fac0f60b7ad" Mar 20 11:14:47 crc kubenswrapper[4860]: E0320 11:14:47.728834 4860 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/designate-operator@sha256:12841b27173f5f1beeb83112e057c8753f4cf411f583fba4f0610fac0f60b7ad,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nkfnh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-588d4d986b-2692b_openstack-operators(20d35dc6-0fc2-4651-9dcd-855814132a5f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 11:14:47 crc kubenswrapper[4860]: E0320 11:14:47.731875 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-2692b" podUID="20d35dc6-0fc2-4651-9dcd-855814132a5f" Mar 20 11:14:48 crc kubenswrapper[4860]: I0320 11:14:48.247111 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-mc48w" event={"ID":"acf57205-3b95-48a3-8222-1b57b0b6c54b","Type":"ContainerStarted","Data":"c4b708f6c8bd7e1d8e843e24fefa9075c80eab27f8abcba486675fbbef953e17"} Mar 20 11:14:48 crc kubenswrapper[4860]: I0320 11:14:48.247548 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-mc48w" Mar 20 11:14:48 crc 
kubenswrapper[4860]: I0320 11:14:48.250448 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-55f864c847-pzk5m" Mar 20 11:14:48 crc kubenswrapper[4860]: I0320 11:14:48.260767 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-b4zcf" event={"ID":"f329ab6d-5c8c-4ed2-a830-d0a04bb31071","Type":"ContainerStarted","Data":"8774d2ddb9394da732436a583f367fb0559b05e79ed51417ba4b8379d19eccce"} Mar 20 11:14:48 crc kubenswrapper[4860]: I0320 11:14:48.260942 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-b4zcf" Mar 20 11:14:48 crc kubenswrapper[4860]: I0320 11:14:48.263931 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-8dh72" event={"ID":"8b4d2530-4f67-45e8-9444-bea25fdad6ae","Type":"ContainerStarted","Data":"6053b33dfa6413e5e99e79e183c60e20067618b574ae56cd39f2373a9f9e4256"} Mar 20 11:14:48 crc kubenswrapper[4860]: I0320 11:14:48.264500 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-8dh72" Mar 20 11:14:48 crc kubenswrapper[4860]: I0320 11:14:48.283745 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-c674c5965-dvptb" Mar 20 11:14:48 crc kubenswrapper[4860]: I0320 11:14:48.289689 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-mc48w" podStartSLOduration=3.771780658 podStartE2EDuration="19.289667936s" podCreationTimestamp="2026-03-20 11:14:29 +0000 UTC" firstStartedPulling="2026-03-20 11:14:32.306944029 +0000 UTC m=+1196.528304927" lastFinishedPulling="2026-03-20 11:14:47.824831307 +0000 UTC 
m=+1212.046192205" observedRunningTime="2026-03-20 11:14:48.280887688 +0000 UTC m=+1212.502248586" watchObservedRunningTime="2026-03-20 11:14:48.289667936 +0000 UTC m=+1212.511028834" Mar 20 11:14:48 crc kubenswrapper[4860]: I0320 11:14:48.299345 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-669fff9c7c-njzqs"] Mar 20 11:14:48 crc kubenswrapper[4860]: I0320 11:14:48.308840 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-b4zcf" podStartSLOduration=3.002975794 podStartE2EDuration="18.308816674s" podCreationTimestamp="2026-03-20 11:14:30 +0000 UTC" firstStartedPulling="2026-03-20 11:14:32.527130247 +0000 UTC m=+1196.748491145" lastFinishedPulling="2026-03-20 11:14:47.832971127 +0000 UTC m=+1212.054332025" observedRunningTime="2026-03-20 11:14:48.303773358 +0000 UTC m=+1212.525134266" watchObservedRunningTime="2026-03-20 11:14:48.308816674 +0000 UTC m=+1212.530177572" Mar 20 11:14:48 crc kubenswrapper[4860]: I0320 11:14:48.319526 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-s2kwq" event={"ID":"178fff2d-699c-4cab-8626-3e30a6bd9ed6","Type":"ContainerStarted","Data":"cffbad57ee8dff1bab6dc6744d40cffde1e9ed1ee59d24cab24773ff0852cacd"} Mar 20 11:14:48 crc kubenswrapper[4860]: I0320 11:14:48.319694 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-s2kwq" Mar 20 11:14:48 crc kubenswrapper[4860]: E0320 11:14:48.322321 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/designate-operator@sha256:12841b27173f5f1beeb83112e057c8753f4cf411f583fba4f0610fac0f60b7ad\\\"\"" 
pod="openstack-operators/designate-operator-controller-manager-588d4d986b-2692b" podUID="20d35dc6-0fc2-4651-9dcd-855814132a5f" Mar 20 11:14:48 crc kubenswrapper[4860]: I0320 11:14:48.343277 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-8dh72" podStartSLOduration=3.728024254 podStartE2EDuration="19.343247836s" podCreationTimestamp="2026-03-20 11:14:29 +0000 UTC" firstStartedPulling="2026-03-20 11:14:32.186300624 +0000 UTC m=+1196.407661522" lastFinishedPulling="2026-03-20 11:14:47.801524206 +0000 UTC m=+1212.022885104" observedRunningTime="2026-03-20 11:14:48.332015372 +0000 UTC m=+1212.553376260" watchObservedRunningTime="2026-03-20 11:14:48.343247836 +0000 UTC m=+1212.564608734" Mar 20 11:14:48 crc kubenswrapper[4860]: I0320 11:14:48.363943 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-55f864c847-pzk5m" podStartSLOduration=3.857474597 podStartE2EDuration="19.363917755s" podCreationTimestamp="2026-03-20 11:14:29 +0000 UTC" firstStartedPulling="2026-03-20 11:14:32.305534471 +0000 UTC m=+1196.526895369" lastFinishedPulling="2026-03-20 11:14:47.811977629 +0000 UTC m=+1212.033338527" observedRunningTime="2026-03-20 11:14:48.362632561 +0000 UTC m=+1212.583993459" watchObservedRunningTime="2026-03-20 11:14:48.363917755 +0000 UTC m=+1212.585278653" Mar 20 11:14:48 crc kubenswrapper[4860]: I0320 11:14:48.442144 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-s2kwq" podStartSLOduration=3.800345561 podStartE2EDuration="19.442121492s" podCreationTimestamp="2026-03-20 11:14:29 +0000 UTC" firstStartedPulling="2026-03-20 11:14:32.15437111 +0000 UTC m=+1196.375732008" lastFinishedPulling="2026-03-20 11:14:47.796147041 +0000 UTC m=+1212.017507939" observedRunningTime="2026-03-20 11:14:48.439160572 +0000 
UTC m=+1212.660521470" watchObservedRunningTime="2026-03-20 11:14:48.442121492 +0000 UTC m=+1212.663482390" Mar 20 11:14:48 crc kubenswrapper[4860]: I0320 11:14:48.493998 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-c674c5965-dvptb" podStartSLOduration=3.0036312020000002 podStartE2EDuration="18.493962244s" podCreationTimestamp="2026-03-20 11:14:30 +0000 UTC" firstStartedPulling="2026-03-20 11:14:32.305765027 +0000 UTC m=+1196.527125915" lastFinishedPulling="2026-03-20 11:14:47.796096069 +0000 UTC m=+1212.017456957" observedRunningTime="2026-03-20 11:14:48.476211044 +0000 UTC m=+1212.697571942" watchObservedRunningTime="2026-03-20 11:14:48.493962244 +0000 UTC m=+1212.715323142" Mar 20 11:14:49 crc kubenswrapper[4860]: I0320 11:14:49.332079 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-2vsjq" event={"ID":"29801d0c-963e-4b38-ad2d-8b03d3ade0be","Type":"ContainerStarted","Data":"6690b66690c5cf1349d47f82f23e3519efd44d7ad7ae0c4d33aab1d25cd9beef"} Mar 20 11:14:49 crc kubenswrapper[4860]: I0320 11:14:49.332489 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-767865f676-2vsjq" Mar 20 11:14:49 crc kubenswrapper[4860]: I0320 11:14:49.337152 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-njzqs" event={"ID":"70703379-8eb2-4f8a-95c8-302b53692a53","Type":"ContainerStarted","Data":"3d3f218cca6614ef4443974bf3189504e2f4dad8910e7ade3a11f253b6812c7a"} Mar 20 11:14:49 crc kubenswrapper[4860]: I0320 11:14:49.347681 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-dvptb" 
event={"ID":"cce5926a-9df6-4915-a94f-02cf2f74fccc","Type":"ContainerStarted","Data":"f1a8ff33fc67fd46243a472e2c23e9c942bf5c189453732627de51362450ceac"} Mar 20 11:14:49 crc kubenswrapper[4860]: I0320 11:14:49.436239 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-vw2d9" Mar 20 11:14:49 crc kubenswrapper[4860]: I0320 11:14:49.436286 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-vw2d9" event={"ID":"36138670-7449-4d49-8a23-73b57d10b67f","Type":"ContainerStarted","Data":"b8796b669a8fb9a766536a6eb3d810510e2e840db450b7dbb1246df0ffe345db"} Mar 20 11:14:49 crc kubenswrapper[4860]: I0320 11:14:49.436317 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-884679f54-4nk5c" Mar 20 11:14:49 crc kubenswrapper[4860]: I0320 11:14:49.436336 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-4nk5c" event={"ID":"c736e6d7-6806-4ef3-a0b3-f1b17ab33037","Type":"ContainerStarted","Data":"33069470f44998b1b24652c7acc1a1f10de7a07c9c01a765ab5fba90c657622d"} Mar 20 11:14:49 crc kubenswrapper[4860]: I0320 11:14:49.436353 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-pzk5m" event={"ID":"0fe9b978-da91-4568-9b77-0d5930aca888","Type":"ContainerStarted","Data":"cb97551383587fa53efc65e44c647e6c20c4eefcc6915e80b778c246bb71b8bb"} Mar 20 11:14:49 crc kubenswrapper[4860]: I0320 11:14:49.440094 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-767865f676-2vsjq" podStartSLOduration=4.721136719 podStartE2EDuration="20.440075508s" podCreationTimestamp="2026-03-20 11:14:29 +0000 UTC" firstStartedPulling="2026-03-20 11:14:32.115675773 +0000 
UTC m=+1196.337036671" lastFinishedPulling="2026-03-20 11:14:47.834614562 +0000 UTC m=+1212.055975460" observedRunningTime="2026-03-20 11:14:49.366669951 +0000 UTC m=+1213.588030859" watchObservedRunningTime="2026-03-20 11:14:49.440075508 +0000 UTC m=+1213.661436406" Mar 20 11:14:49 crc kubenswrapper[4860]: I0320 11:14:49.449954 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-vw2d9" podStartSLOduration=4.92925423 podStartE2EDuration="20.449933544s" podCreationTimestamp="2026-03-20 11:14:29 +0000 UTC" firstStartedPulling="2026-03-20 11:14:32.279205238 +0000 UTC m=+1196.500566136" lastFinishedPulling="2026-03-20 11:14:47.799884552 +0000 UTC m=+1212.021245450" observedRunningTime="2026-03-20 11:14:49.439360618 +0000 UTC m=+1213.660721526" watchObservedRunningTime="2026-03-20 11:14:49.449933544 +0000 UTC m=+1213.671294442" Mar 20 11:14:49 crc kubenswrapper[4860]: I0320 11:14:49.471289 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-884679f54-4nk5c" podStartSLOduration=5.159443949 podStartE2EDuration="20.471256951s" podCreationTimestamp="2026-03-20 11:14:29 +0000 UTC" firstStartedPulling="2026-03-20 11:14:32.522633185 +0000 UTC m=+1196.743994083" lastFinishedPulling="2026-03-20 11:14:47.834446187 +0000 UTC m=+1212.055807085" observedRunningTime="2026-03-20 11:14:49.464837848 +0000 UTC m=+1213.686198766" watchObservedRunningTime="2026-03-20 11:14:49.471256951 +0000 UTC m=+1213.692617849" Mar 20 11:14:52 crc kubenswrapper[4860]: I0320 11:14:52.344965 4860 patch_prober.go:28] interesting pod/machine-config-daemon-kvdqp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:14:52 crc kubenswrapper[4860]: I0320 11:14:52.345568 
4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:14:52 crc kubenswrapper[4860]: I0320 11:14:52.345633 4860 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" Mar 20 11:14:52 crc kubenswrapper[4860]: I0320 11:14:52.346473 4860 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"88d9c34d042546706542bef7e7c591b52fa1f874953861e5601a8c6aea607a26"} pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 11:14:52 crc kubenswrapper[4860]: I0320 11:14:52.346540 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" containerID="cri-o://88d9c34d042546706542bef7e7c591b52fa1f874953861e5601a8c6aea607a26" gracePeriod=600 Mar 20 11:14:53 crc kubenswrapper[4860]: I0320 11:14:53.480096 4860 generic.go:334] "Generic (PLEG): container finished" podID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerID="88d9c34d042546706542bef7e7c591b52fa1f874953861e5601a8c6aea607a26" exitCode=0 Mar 20 11:14:53 crc kubenswrapper[4860]: I0320 11:14:53.480142 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" event={"ID":"6a9df230-75a1-4b64-8d00-c179e9c19080","Type":"ContainerDied","Data":"88d9c34d042546706542bef7e7c591b52fa1f874953861e5601a8c6aea607a26"} Mar 20 11:14:53 crc kubenswrapper[4860]: 
I0320 11:14:53.480724 4860 scope.go:117] "RemoveContainer" containerID="8a778954d1374877671fc9bca86b7581cc1911487a943aba7bf61952dec5e818" Mar 20 11:14:59 crc kubenswrapper[4860]: I0320 11:14:59.747502 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-8dh72" Mar 20 11:14:59 crc kubenswrapper[4860]: I0320 11:14:59.759293 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-s2kwq" Mar 20 11:14:59 crc kubenswrapper[4860]: I0320 11:14:59.857659 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-vw2d9" Mar 20 11:15:00 crc kubenswrapper[4860]: I0320 11:15:00.154786 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566755-bj9gm"] Mar 20 11:15:00 crc kubenswrapper[4860]: I0320 11:15:00.156360 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-bj9gm" Mar 20 11:15:00 crc kubenswrapper[4860]: I0320 11:15:00.161258 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 11:15:00 crc kubenswrapper[4860]: I0320 11:15:00.161326 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 11:15:00 crc kubenswrapper[4860]: I0320 11:15:00.169366 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566755-bj9gm"] Mar 20 11:15:00 crc kubenswrapper[4860]: I0320 11:15:00.263191 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-55f864c847-pzk5m" Mar 20 11:15:00 crc kubenswrapper[4860]: I0320 11:15:00.289632 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/20271235-6d5c-451f-a889-725d0b95503e-config-volume\") pod \"collect-profiles-29566755-bj9gm\" (UID: \"20271235-6d5c-451f-a889-725d0b95503e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-bj9gm" Mar 20 11:15:00 crc kubenswrapper[4860]: I0320 11:15:00.289793 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnfm2\" (UniqueName: \"kubernetes.io/projected/20271235-6d5c-451f-a889-725d0b95503e-kube-api-access-cnfm2\") pod \"collect-profiles-29566755-bj9gm\" (UID: \"20271235-6d5c-451f-a889-725d0b95503e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-bj9gm" Mar 20 11:15:00 crc kubenswrapper[4860]: I0320 11:15:00.289840 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/20271235-6d5c-451f-a889-725d0b95503e-secret-volume\") pod \"collect-profiles-29566755-bj9gm\" (UID: \"20271235-6d5c-451f-a889-725d0b95503e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-bj9gm" Mar 20 11:15:00 crc kubenswrapper[4860]: I0320 11:15:00.363132 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-767865f676-2vsjq" Mar 20 11:15:00 crc kubenswrapper[4860]: I0320 11:15:00.369280 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-mc48w" Mar 20 11:15:00 crc kubenswrapper[4860]: I0320 11:15:00.392050 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/20271235-6d5c-451f-a889-725d0b95503e-secret-volume\") pod \"collect-profiles-29566755-bj9gm\" (UID: \"20271235-6d5c-451f-a889-725d0b95503e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-bj9gm" Mar 20 11:15:00 crc kubenswrapper[4860]: I0320 11:15:00.392190 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/20271235-6d5c-451f-a889-725d0b95503e-config-volume\") pod \"collect-profiles-29566755-bj9gm\" (UID: \"20271235-6d5c-451f-a889-725d0b95503e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-bj9gm" Mar 20 11:15:00 crc kubenswrapper[4860]: I0320 11:15:00.392346 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnfm2\" (UniqueName: \"kubernetes.io/projected/20271235-6d5c-451f-a889-725d0b95503e-kube-api-access-cnfm2\") pod \"collect-profiles-29566755-bj9gm\" (UID: \"20271235-6d5c-451f-a889-725d0b95503e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-bj9gm" Mar 20 11:15:00 crc kubenswrapper[4860]: I0320 
11:15:00.396786 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/20271235-6d5c-451f-a889-725d0b95503e-config-volume\") pod \"collect-profiles-29566755-bj9gm\" (UID: \"20271235-6d5c-451f-a889-725d0b95503e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-bj9gm" Mar 20 11:15:00 crc kubenswrapper[4860]: I0320 11:15:00.405433 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/20271235-6d5c-451f-a889-725d0b95503e-secret-volume\") pod \"collect-profiles-29566755-bj9gm\" (UID: \"20271235-6d5c-451f-a889-725d0b95503e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-bj9gm" Mar 20 11:15:00 crc kubenswrapper[4860]: I0320 11:15:00.415060 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnfm2\" (UniqueName: \"kubernetes.io/projected/20271235-6d5c-451f-a889-725d0b95503e-kube-api-access-cnfm2\") pod \"collect-profiles-29566755-bj9gm\" (UID: \"20271235-6d5c-451f-a889-725d0b95503e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-bj9gm" Mar 20 11:15:00 crc kubenswrapper[4860]: I0320 11:15:00.489816 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-bj9gm" Mar 20 11:15:00 crc kubenswrapper[4860]: I0320 11:15:00.502097 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-884679f54-4nk5c" Mar 20 11:15:00 crc kubenswrapper[4860]: I0320 11:15:00.695719 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-c674c5965-dvptb" Mar 20 11:15:01 crc kubenswrapper[4860]: I0320 11:15:01.112131 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-b4zcf" Mar 20 11:15:02 crc kubenswrapper[4860]: I0320 11:15:02.120092 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ecf64e38-138d-4ef7-8b17-c09f30358f3e-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-5tqgx\" (UID: \"ecf64e38-138d-4ef7-8b17-c09f30358f3e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-5tqgx" Mar 20 11:15:02 crc kubenswrapper[4860]: I0320 11:15:02.136345 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ecf64e38-138d-4ef7-8b17-c09f30358f3e-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-5tqgx\" (UID: \"ecf64e38-138d-4ef7-8b17-c09f30358f3e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-5tqgx" Mar 20 11:15:02 crc kubenswrapper[4860]: E0320 11:15:02.166882 4860 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:6e7552996253fc66667eaa3eb0e11b4e97145efa2ae577155ceabf8e9913ddc1" Mar 20 11:15:02 crc kubenswrapper[4860]: E0320 
11:15:02.167142 4860 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:6e7552996253fc66667eaa3eb0e11b4e97145efa2ae577155ceabf8e9913ddc1,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-99lpj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-67ccfc9778-m8948_openstack-operators(d7202366-6dc1-45ca-bb9a-74bdd0426c5f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 11:15:02 crc kubenswrapper[4860]: E0320 11:15:02.168422 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-m8948" podUID="d7202366-6dc1-45ca-bb9a-74bdd0426c5f" Mar 20 11:15:02 crc kubenswrapper[4860]: I0320 11:15:02.332878 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-k6th9" Mar 20 11:15:02 crc kubenswrapper[4860]: I0320 11:15:02.340993 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-5tqgx" Mar 20 11:15:02 crc kubenswrapper[4860]: I0320 11:15:02.831508 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-webhook-certs\") pod \"openstack-operator-controller-manager-6697dffbc-hpk42\" (UID: \"84431296-0ca0-425a-8da8-c3ea46b08b29\") " pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-hpk42" Mar 20 11:15:02 crc kubenswrapper[4860]: I0320 11:15:02.832412 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-metrics-certs\") pod \"openstack-operator-controller-manager-6697dffbc-hpk42\" (UID: \"84431296-0ca0-425a-8da8-c3ea46b08b29\") " pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-hpk42" Mar 20 11:15:02 crc kubenswrapper[4860]: I0320 11:15:02.839974 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-webhook-certs\") pod \"openstack-operator-controller-manager-6697dffbc-hpk42\" (UID: \"84431296-0ca0-425a-8da8-c3ea46b08b29\") " pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-hpk42" Mar 20 11:15:02 crc kubenswrapper[4860]: I0320 11:15:02.844988 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/84431296-0ca0-425a-8da8-c3ea46b08b29-metrics-certs\") pod \"openstack-operator-controller-manager-6697dffbc-hpk42\" (UID: \"84431296-0ca0-425a-8da8-c3ea46b08b29\") " pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-hpk42" Mar 20 11:15:02 crc kubenswrapper[4860]: I0320 11:15:02.856955 4860 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-rz2ll" Mar 20 11:15:02 crc kubenswrapper[4860]: I0320 11:15:02.864535 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-hpk42" Mar 20 11:15:04 crc kubenswrapper[4860]: I0320 11:15:04.395357 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566755-bj9gm"] Mar 20 11:15:04 crc kubenswrapper[4860]: I0320 11:15:04.486183 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-5tqgx"] Mar 20 11:15:04 crc kubenswrapper[4860]: I0320 11:15:04.524587 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6697dffbc-hpk42"] Mar 20 11:15:04 crc kubenswrapper[4860]: I0320 11:15:04.616650 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-njzqs" event={"ID":"70703379-8eb2-4f8a-95c8-302b53692a53","Type":"ContainerStarted","Data":"c0d6274ac4037eb1f88e930a94b737d0a2f092be4b1c7367386f1ceb6cc01ced"} Mar 20 11:15:04 crc kubenswrapper[4860]: I0320 11:15:04.616824 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-njzqs" Mar 20 11:15:04 crc kubenswrapper[4860]: I0320 11:15:04.618063 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-ncmzn" event={"ID":"1723efcf-97d7-4101-a15d-d4776d45d29b","Type":"ContainerStarted","Data":"44ec7ea3c410da47fcddc13a213fd5b93220b87f7aca26a891c5672788b46187"} Mar 20 11:15:04 crc kubenswrapper[4860]: I0320 11:15:04.619011 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-ncmzn" Mar 20 11:15:04 crc kubenswrapper[4860]: W0320 11:15:04.619777 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84431296_0ca0_425a_8da8_c3ea46b08b29.slice/crio-97cfe89104b2ede051095b525c6b537319ce860f5e50deffd73275cff40a8b5d WatchSource:0}: Error finding container 97cfe89104b2ede051095b525c6b537319ce860f5e50deffd73275cff40a8b5d: Status 404 returned error can't find the container with id 97cfe89104b2ede051095b525c6b537319ce860f5e50deffd73275cff40a8b5d Mar 20 11:15:04 crc kubenswrapper[4860]: I0320 11:15:04.623979 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-jd9bn" event={"ID":"b5e881e2-f657-418f-ba87-7074722307a2","Type":"ContainerStarted","Data":"561d50dea49b7501b2c0729c0aad9aa1605b8f1c2182c02db2a03fad6dc3dbd2"} Mar 20 11:15:04 crc kubenswrapper[4860]: I0320 11:15:04.624976 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-jd9bn" Mar 20 11:15:04 crc kubenswrapper[4860]: I0320 11:15:04.634855 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-zphz9" event={"ID":"5fdbc315-f7fd-47ac-aa39-fdbe068f6f3b","Type":"ContainerStarted","Data":"cda8cceec74b6fb38d0cd7ae8613b4b2d45dde15d0a48b166b8c01053c6539f1"} Mar 20 11:15:04 crc kubenswrapper[4860]: I0320 11:15:04.635200 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-zphz9" Mar 20 11:15:04 crc kubenswrapper[4860]: I0320 11:15:04.642264 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-pq75b" 
event={"ID":"fbbe8243-9afb-4fc5-90f1-04d6f0c074ef","Type":"ContainerStarted","Data":"ba26b0323463584a603cb064561889cf3fb5b72612fc2681710c544b1496e206"} Mar 20 11:15:04 crc kubenswrapper[4860]: I0320 11:15:04.643364 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-pq75b" Mar 20 11:15:04 crc kubenswrapper[4860]: I0320 11:15:04.652301 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-wfczk" event={"ID":"c54f27c4-bd61-4bad-bf91-376fee65d219","Type":"ContainerStarted","Data":"a224617de695744e3935046ee8fa401c32b3ae077622c9c12a3420884f019f7c"} Mar 20 11:15:04 crc kubenswrapper[4860]: I0320 11:15:04.652719 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-wfczk" Mar 20 11:15:04 crc kubenswrapper[4860]: I0320 11:15:04.658088 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-njzqs" podStartSLOduration=20.479190485 podStartE2EDuration="35.65805975s" podCreationTimestamp="2026-03-20 11:14:29 +0000 UTC" firstStartedPulling="2026-03-20 11:14:48.324477078 +0000 UTC m=+1212.545837976" lastFinishedPulling="2026-03-20 11:15:03.503346343 +0000 UTC m=+1227.724707241" observedRunningTime="2026-03-20 11:15:04.646397514 +0000 UTC m=+1228.867758422" watchObservedRunningTime="2026-03-20 11:15:04.65805975 +0000 UTC m=+1228.879420648" Mar 20 11:15:04 crc kubenswrapper[4860]: I0320 11:15:04.690791 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" event={"ID":"6a9df230-75a1-4b64-8d00-c179e9c19080","Type":"ContainerStarted","Data":"30e51fcb3abe382780764ee7923f09110290ea7a894489479cd1fa264f3e332e"} Mar 20 11:15:04 crc kubenswrapper[4860]: I0320 11:15:04.695550 4860 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-jd9bn" podStartSLOduration=3.744695935 podStartE2EDuration="34.695525464s" podCreationTimestamp="2026-03-20 11:14:30 +0000 UTC" firstStartedPulling="2026-03-20 11:14:32.551978219 +0000 UTC m=+1196.773339117" lastFinishedPulling="2026-03-20 11:15:03.502807738 +0000 UTC m=+1227.724168646" observedRunningTime="2026-03-20 11:15:04.691814133 +0000 UTC m=+1228.913175031" watchObservedRunningTime="2026-03-20 11:15:04.695525464 +0000 UTC m=+1228.916886362" Mar 20 11:15:04 crc kubenswrapper[4860]: I0320 11:15:04.707400 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-z8fp5" event={"ID":"6c2530cf-70b4-4a89-acff-086b36773edf","Type":"ContainerStarted","Data":"d2991549d097e0f8e60163dd8347ebd592f21a708b6bf45559157b0629e5b224"} Mar 20 11:15:04 crc kubenswrapper[4860]: I0320 11:15:04.708376 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-z8fp5" Mar 20 11:15:04 crc kubenswrapper[4860]: I0320 11:15:04.712780 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-tjt52" event={"ID":"431ab970-7f36-4ace-860c-479faac092a0","Type":"ContainerStarted","Data":"a7267c3f93af9a30407e1ba678b7a89c0f8d15d671aa4a7d18f3ff4922e4d82e"} Mar 20 11:15:04 crc kubenswrapper[4860]: I0320 11:15:04.713652 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-tjt52" Mar 20 11:15:04 crc kubenswrapper[4860]: I0320 11:15:04.714958 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-bj9gm" 
event={"ID":"20271235-6d5c-451f-a889-725d0b95503e","Type":"ContainerStarted","Data":"1d3b45bad39f776b6c84ee93cf3f652011c9061faef9e35f45a594b16cc6b8aa"} Mar 20 11:15:04 crc kubenswrapper[4860]: I0320 11:15:04.722131 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-4tdg4" event={"ID":"7f73053a-86aa-42dc-bcca-ee26a4fda2e5","Type":"ContainerStarted","Data":"d85ed8e6e34fbe2b90820068da762f81b3aa62920cd63e30346924dcf18ff7a4"} Mar 20 11:15:04 crc kubenswrapper[4860]: I0320 11:15:04.724473 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5784578c99-4tdg4" Mar 20 11:15:04 crc kubenswrapper[4860]: I0320 11:15:04.731097 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-5tqgx" event={"ID":"ecf64e38-138d-4ef7-8b17-c09f30358f3e","Type":"ContainerStarted","Data":"0a29df98ab6deaab732d4563afe2030291420ed0b8c860be68c2c0bbe6b53dfd"} Mar 20 11:15:04 crc kubenswrapper[4860]: I0320 11:15:04.745632 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-zphz9" podStartSLOduration=4.423815212 podStartE2EDuration="35.745599329s" podCreationTimestamp="2026-03-20 11:14:29 +0000 UTC" firstStartedPulling="2026-03-20 11:14:32.296824015 +0000 UTC m=+1196.518184913" lastFinishedPulling="2026-03-20 11:15:03.618608122 +0000 UTC m=+1227.839969030" observedRunningTime="2026-03-20 11:15:04.728128096 +0000 UTC m=+1228.949488994" watchObservedRunningTime="2026-03-20 11:15:04.745599329 +0000 UTC m=+1228.966960227" Mar 20 11:15:04 crc kubenswrapper[4860]: I0320 11:15:04.795583 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-pq75b" podStartSLOduration=4.807159155 
podStartE2EDuration="35.795553801s" podCreationTimestamp="2026-03-20 11:14:29 +0000 UTC" firstStartedPulling="2026-03-20 11:14:32.658135522 +0000 UTC m=+1196.879496420" lastFinishedPulling="2026-03-20 11:15:03.646530168 +0000 UTC m=+1227.867891066" observedRunningTime="2026-03-20 11:15:04.749616417 +0000 UTC m=+1228.970977315" watchObservedRunningTime="2026-03-20 11:15:04.795553801 +0000 UTC m=+1229.016914699" Mar 20 11:15:04 crc kubenswrapper[4860]: I0320 11:15:04.840198 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-ncmzn" podStartSLOduration=3.993839366 podStartE2EDuration="34.840174798s" podCreationTimestamp="2026-03-20 11:14:30 +0000 UTC" firstStartedPulling="2026-03-20 11:14:32.658957094 +0000 UTC m=+1196.880317992" lastFinishedPulling="2026-03-20 11:15:03.505292526 +0000 UTC m=+1227.726653424" observedRunningTime="2026-03-20 11:15:04.814414001 +0000 UTC m=+1229.035774899" watchObservedRunningTime="2026-03-20 11:15:04.840174798 +0000 UTC m=+1229.061535696" Mar 20 11:15:04 crc kubenswrapper[4860]: I0320 11:15:04.849578 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-tjt52" podStartSLOduration=4.905412624 podStartE2EDuration="35.849557312s" podCreationTimestamp="2026-03-20 11:14:29 +0000 UTC" firstStartedPulling="2026-03-20 11:14:32.558622529 +0000 UTC m=+1196.779983437" lastFinishedPulling="2026-03-20 11:15:03.502767217 +0000 UTC m=+1227.724128125" observedRunningTime="2026-03-20 11:15:04.838556244 +0000 UTC m=+1229.059917142" watchObservedRunningTime="2026-03-20 11:15:04.849557312 +0000 UTC m=+1229.070918210" Mar 20 11:15:04 crc kubenswrapper[4860]: I0320 11:15:04.884488 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-5784578c99-4tdg4" podStartSLOduration=4.69685691 
podStartE2EDuration="35.884456816s" podCreationTimestamp="2026-03-20 11:14:29 +0000 UTC" firstStartedPulling="2026-03-20 11:14:32.315661745 +0000 UTC m=+1196.537022643" lastFinishedPulling="2026-03-20 11:15:03.503261651 +0000 UTC m=+1227.724622549" observedRunningTime="2026-03-20 11:15:04.871993439 +0000 UTC m=+1229.093354337" watchObservedRunningTime="2026-03-20 11:15:04.884456816 +0000 UTC m=+1229.105817714" Mar 20 11:15:05 crc kubenswrapper[4860]: I0320 11:15:05.038213 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-wfczk" podStartSLOduration=4.735092375 podStartE2EDuration="36.038181596s" podCreationTimestamp="2026-03-20 11:14:29 +0000 UTC" firstStartedPulling="2026-03-20 11:14:32.313844116 +0000 UTC m=+1196.535205014" lastFinishedPulling="2026-03-20 11:15:03.616933337 +0000 UTC m=+1227.838294235" observedRunningTime="2026-03-20 11:15:04.976693502 +0000 UTC m=+1229.198054410" watchObservedRunningTime="2026-03-20 11:15:05.038181596 +0000 UTC m=+1229.259542494" Mar 20 11:15:05 crc kubenswrapper[4860]: I0320 11:15:05.047719 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-z8fp5" podStartSLOduration=4.989271734 podStartE2EDuration="36.047689614s" podCreationTimestamp="2026-03-20 11:14:29 +0000 UTC" firstStartedPulling="2026-03-20 11:14:32.560094229 +0000 UTC m=+1196.781455127" lastFinishedPulling="2026-03-20 11:15:03.618512109 +0000 UTC m=+1227.839873007" observedRunningTime="2026-03-20 11:15:05.0216802 +0000 UTC m=+1229.243041098" watchObservedRunningTime="2026-03-20 11:15:05.047689614 +0000 UTC m=+1229.269050512" Mar 20 11:15:05 crc kubenswrapper[4860]: I0320 11:15:05.740095 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-2692b" 
event={"ID":"20d35dc6-0fc2-4651-9dcd-855814132a5f","Type":"ContainerStarted","Data":"ccaba55bb65ff754bdc14a93c8613b20a19800a38989d934340a69d51a04a280"} Mar 20 11:15:05 crc kubenswrapper[4860]: I0320 11:15:05.741677 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-2692b" Mar 20 11:15:05 crc kubenswrapper[4860]: I0320 11:15:05.745067 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-hpk42" event={"ID":"84431296-0ca0-425a-8da8-c3ea46b08b29","Type":"ContainerStarted","Data":"12a84276fddb15a4fc74dbf9612e90fceec66679e5eacfe37bd7258933168d14"} Mar 20 11:15:05 crc kubenswrapper[4860]: I0320 11:15:05.745100 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-hpk42" event={"ID":"84431296-0ca0-425a-8da8-c3ea46b08b29","Type":"ContainerStarted","Data":"97cfe89104b2ede051095b525c6b537319ce860f5e50deffd73275cff40a8b5d"} Mar 20 11:15:05 crc kubenswrapper[4860]: I0320 11:15:05.745696 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-hpk42" Mar 20 11:15:05 crc kubenswrapper[4860]: I0320 11:15:05.749936 4860 generic.go:334] "Generic (PLEG): container finished" podID="20271235-6d5c-451f-a889-725d0b95503e" containerID="16689897d8bfeb9aac52fd534664320896c6183e3c8e9d6ce4861a8cab7d6c12" exitCode=0 Mar 20 11:15:05 crc kubenswrapper[4860]: I0320 11:15:05.750678 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-bj9gm" event={"ID":"20271235-6d5c-451f-a889-725d0b95503e","Type":"ContainerDied","Data":"16689897d8bfeb9aac52fd534664320896c6183e3c8e9d6ce4861a8cab7d6c12"} Mar 20 11:15:05 crc kubenswrapper[4860]: I0320 11:15:05.781699 4860 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-2692b" podStartSLOduration=5.010222409 podStartE2EDuration="36.781676294s" podCreationTimestamp="2026-03-20 11:14:29 +0000 UTC" firstStartedPulling="2026-03-20 11:14:32.275307413 +0000 UTC m=+1196.496668311" lastFinishedPulling="2026-03-20 11:15:04.046761298 +0000 UTC m=+1228.268122196" observedRunningTime="2026-03-20 11:15:05.774564991 +0000 UTC m=+1229.995925889" watchObservedRunningTime="2026-03-20 11:15:05.781676294 +0000 UTC m=+1230.003037192" Mar 20 11:15:05 crc kubenswrapper[4860]: I0320 11:15:05.827107 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-hpk42" podStartSLOduration=35.827074655 podStartE2EDuration="35.827074655s" podCreationTimestamp="2026-03-20 11:14:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 11:15:05.822297286 +0000 UTC m=+1230.043658194" watchObservedRunningTime="2026-03-20 11:15:05.827074655 +0000 UTC m=+1230.048435553" Mar 20 11:15:08 crc kubenswrapper[4860]: I0320 11:15:08.188291 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-bj9gm" Mar 20 11:15:08 crc kubenswrapper[4860]: I0320 11:15:08.298168 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cnfm2\" (UniqueName: \"kubernetes.io/projected/20271235-6d5c-451f-a889-725d0b95503e-kube-api-access-cnfm2\") pod \"20271235-6d5c-451f-a889-725d0b95503e\" (UID: \"20271235-6d5c-451f-a889-725d0b95503e\") " Mar 20 11:15:08 crc kubenswrapper[4860]: I0320 11:15:08.298304 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/20271235-6d5c-451f-a889-725d0b95503e-secret-volume\") pod \"20271235-6d5c-451f-a889-725d0b95503e\" (UID: \"20271235-6d5c-451f-a889-725d0b95503e\") " Mar 20 11:15:08 crc kubenswrapper[4860]: I0320 11:15:08.298334 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/20271235-6d5c-451f-a889-725d0b95503e-config-volume\") pod \"20271235-6d5c-451f-a889-725d0b95503e\" (UID: \"20271235-6d5c-451f-a889-725d0b95503e\") " Mar 20 11:15:08 crc kubenswrapper[4860]: I0320 11:15:08.299617 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20271235-6d5c-451f-a889-725d0b95503e-config-volume" (OuterVolumeSpecName: "config-volume") pod "20271235-6d5c-451f-a889-725d0b95503e" (UID: "20271235-6d5c-451f-a889-725d0b95503e"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:15:08 crc kubenswrapper[4860]: I0320 11:15:08.308837 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20271235-6d5c-451f-a889-725d0b95503e-kube-api-access-cnfm2" (OuterVolumeSpecName: "kube-api-access-cnfm2") pod "20271235-6d5c-451f-a889-725d0b95503e" (UID: "20271235-6d5c-451f-a889-725d0b95503e"). 
InnerVolumeSpecName "kube-api-access-cnfm2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:15:08 crc kubenswrapper[4860]: I0320 11:15:08.315022 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20271235-6d5c-451f-a889-725d0b95503e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "20271235-6d5c-451f-a889-725d0b95503e" (UID: "20271235-6d5c-451f-a889-725d0b95503e"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:15:08 crc kubenswrapper[4860]: I0320 11:15:08.400576 4860 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/20271235-6d5c-451f-a889-725d0b95503e-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 11:15:08 crc kubenswrapper[4860]: I0320 11:15:08.400632 4860 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/20271235-6d5c-451f-a889-725d0b95503e-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 11:15:08 crc kubenswrapper[4860]: I0320 11:15:08.400647 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cnfm2\" (UniqueName: \"kubernetes.io/projected/20271235-6d5c-451f-a889-725d0b95503e-kube-api-access-cnfm2\") on node \"crc\" DevicePath \"\"" Mar 20 11:15:08 crc kubenswrapper[4860]: I0320 11:15:08.803539 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-5tqgx" event={"ID":"ecf64e38-138d-4ef7-8b17-c09f30358f3e","Type":"ContainerStarted","Data":"a65f23e5c33dc643e187e8f730a3a764838674bd742f1ae12902e714eb89978e"} Mar 20 11:15:08 crc kubenswrapper[4860]: I0320 11:15:08.803702 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-5tqgx" Mar 20 11:15:08 crc kubenswrapper[4860]: I0320 11:15:08.805966 4860 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-bj9gm" event={"ID":"20271235-6d5c-451f-a889-725d0b95503e","Type":"ContainerDied","Data":"1d3b45bad39f776b6c84ee93cf3f652011c9061faef9e35f45a594b16cc6b8aa"} Mar 20 11:15:08 crc kubenswrapper[4860]: I0320 11:15:08.806008 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-bj9gm" Mar 20 11:15:08 crc kubenswrapper[4860]: I0320 11:15:08.806024 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d3b45bad39f776b6c84ee93cf3f652011c9061faef9e35f45a594b16cc6b8aa" Mar 20 11:15:08 crc kubenswrapper[4860]: I0320 11:15:08.838215 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-5tqgx" podStartSLOduration=36.250226869 podStartE2EDuration="39.838195337s" podCreationTimestamp="2026-03-20 11:14:29 +0000 UTC" firstStartedPulling="2026-03-20 11:15:04.591621423 +0000 UTC m=+1228.812982321" lastFinishedPulling="2026-03-20 11:15:08.179589891 +0000 UTC m=+1232.400950789" observedRunningTime="2026-03-20 11:15:08.834148397 +0000 UTC m=+1233.055509295" watchObservedRunningTime="2026-03-20 11:15:08.838195337 +0000 UTC m=+1233.059556235" Mar 20 11:15:09 crc kubenswrapper[4860]: I0320 11:15:09.909284 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-2692b" Mar 20 11:15:09 crc kubenswrapper[4860]: I0320 11:15:09.966584 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-wfczk" Mar 20 11:15:10 crc kubenswrapper[4860]: I0320 11:15:10.129395 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-zphz9" Mar 20 11:15:10 crc kubenswrapper[4860]: I0320 11:15:10.366119 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-pq75b" Mar 20 11:15:10 crc kubenswrapper[4860]: I0320 11:15:10.442344 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-z8fp5" Mar 20 11:15:10 crc kubenswrapper[4860]: I0320 11:15:10.452691 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-tjt52" Mar 20 11:15:10 crc kubenswrapper[4860]: I0320 11:15:10.631373 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5784578c99-4tdg4" Mar 20 11:15:10 crc kubenswrapper[4860]: I0320 11:15:10.789144 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-jd9bn" Mar 20 11:15:11 crc kubenswrapper[4860]: I0320 11:15:11.179287 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-ncmzn" Mar 20 11:15:12 crc kubenswrapper[4860]: I0320 11:15:12.872578 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-hpk42" Mar 20 11:15:13 crc kubenswrapper[4860]: E0320 11:15:13.416131 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:6e7552996253fc66667eaa3eb0e11b4e97145efa2ae577155ceabf8e9913ddc1\\\"\"" 
pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-m8948" podUID="d7202366-6dc1-45ca-bb9a-74bdd0426c5f" Mar 20 11:15:15 crc kubenswrapper[4860]: I0320 11:15:15.898260 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-njzqs" Mar 20 11:15:22 crc kubenswrapper[4860]: I0320 11:15:22.347859 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-5tqgx" Mar 20 11:15:26 crc kubenswrapper[4860]: I0320 11:15:26.416454 4860 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 11:15:27 crc kubenswrapper[4860]: I0320 11:15:27.981275 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-m8948" event={"ID":"d7202366-6dc1-45ca-bb9a-74bdd0426c5f","Type":"ContainerStarted","Data":"cab15acc2448b50eaeed93fa772423a9f249e78bfcf9d91b29b289883986dd27"} Mar 20 11:15:27 crc kubenswrapper[4860]: I0320 11:15:27.982130 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-m8948" Mar 20 11:15:28 crc kubenswrapper[4860]: I0320 11:15:28.003245 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-m8948" podStartSLOduration=3.880607048 podStartE2EDuration="59.003191678s" podCreationTimestamp="2026-03-20 11:14:29 +0000 UTC" firstStartedPulling="2026-03-20 11:14:32.559932274 +0000 UTC m=+1196.781293172" lastFinishedPulling="2026-03-20 11:15:27.682516894 +0000 UTC m=+1251.903877802" observedRunningTime="2026-03-20 11:15:27.999474337 +0000 UTC m=+1252.220835235" watchObservedRunningTime="2026-03-20 11:15:28.003191678 +0000 UTC m=+1252.224552576" Mar 20 11:15:40 crc kubenswrapper[4860]: I0320 
11:15:40.620034 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-m8948" Mar 20 11:15:55 crc kubenswrapper[4860]: I0320 11:15:55.568565 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-dbwww"] Mar 20 11:15:55 crc kubenswrapper[4860]: E0320 11:15:55.569925 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20271235-6d5c-451f-a889-725d0b95503e" containerName="collect-profiles" Mar 20 11:15:55 crc kubenswrapper[4860]: I0320 11:15:55.569947 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="20271235-6d5c-451f-a889-725d0b95503e" containerName="collect-profiles" Mar 20 11:15:55 crc kubenswrapper[4860]: I0320 11:15:55.570143 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="20271235-6d5c-451f-a889-725d0b95503e" containerName="collect-profiles" Mar 20 11:15:55 crc kubenswrapper[4860]: I0320 11:15:55.571172 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-dbwww" Mar 20 11:15:55 crc kubenswrapper[4860]: I0320 11:15:55.616718 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 20 11:15:55 crc kubenswrapper[4860]: I0320 11:15:55.616728 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 20 11:15:55 crc kubenswrapper[4860]: I0320 11:15:55.616917 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-tt45n" Mar 20 11:15:55 crc kubenswrapper[4860]: I0320 11:15:55.617023 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 20 11:15:55 crc kubenswrapper[4860]: I0320 11:15:55.629756 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-dbwww"] Mar 20 11:15:55 crc kubenswrapper[4860]: I0320 11:15:55.714780 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-sjw98"] Mar 20 11:15:55 crc kubenswrapper[4860]: I0320 11:15:55.716423 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-sjw98" Mar 20 11:15:55 crc kubenswrapper[4860]: I0320 11:15:55.718862 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 20 11:15:55 crc kubenswrapper[4860]: I0320 11:15:55.719031 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1ac2367-dd61-4085-a756-ab4244a03144-config\") pod \"dnsmasq-dns-675f4bcbfc-dbwww\" (UID: \"f1ac2367-dd61-4085-a756-ab4244a03144\") " pod="openstack/dnsmasq-dns-675f4bcbfc-dbwww" Mar 20 11:15:55 crc kubenswrapper[4860]: I0320 11:15:55.719088 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzt8b\" (UniqueName: \"kubernetes.io/projected/f1ac2367-dd61-4085-a756-ab4244a03144-kube-api-access-dzt8b\") pod \"dnsmasq-dns-675f4bcbfc-dbwww\" (UID: \"f1ac2367-dd61-4085-a756-ab4244a03144\") " pod="openstack/dnsmasq-dns-675f4bcbfc-dbwww" Mar 20 11:15:55 crc kubenswrapper[4860]: I0320 11:15:55.725973 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-sjw98"] Mar 20 11:15:55 crc kubenswrapper[4860]: I0320 11:15:55.820705 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ccb7e541-f715-4030-8091-91f7e9eacb4c-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-sjw98\" (UID: \"ccb7e541-f715-4030-8091-91f7e9eacb4c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-sjw98" Mar 20 11:15:55 crc kubenswrapper[4860]: I0320 11:15:55.820785 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1ac2367-dd61-4085-a756-ab4244a03144-config\") pod \"dnsmasq-dns-675f4bcbfc-dbwww\" (UID: \"f1ac2367-dd61-4085-a756-ab4244a03144\") " pod="openstack/dnsmasq-dns-675f4bcbfc-dbwww" Mar 20 11:15:55 crc 
kubenswrapper[4860]: I0320 11:15:55.820818 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmlqn\" (UniqueName: \"kubernetes.io/projected/ccb7e541-f715-4030-8091-91f7e9eacb4c-kube-api-access-mmlqn\") pod \"dnsmasq-dns-78dd6ddcc-sjw98\" (UID: \"ccb7e541-f715-4030-8091-91f7e9eacb4c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-sjw98" Mar 20 11:15:55 crc kubenswrapper[4860]: I0320 11:15:55.820983 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzt8b\" (UniqueName: \"kubernetes.io/projected/f1ac2367-dd61-4085-a756-ab4244a03144-kube-api-access-dzt8b\") pod \"dnsmasq-dns-675f4bcbfc-dbwww\" (UID: \"f1ac2367-dd61-4085-a756-ab4244a03144\") " pod="openstack/dnsmasq-dns-675f4bcbfc-dbwww" Mar 20 11:15:55 crc kubenswrapper[4860]: I0320 11:15:55.821104 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ccb7e541-f715-4030-8091-91f7e9eacb4c-config\") pod \"dnsmasq-dns-78dd6ddcc-sjw98\" (UID: \"ccb7e541-f715-4030-8091-91f7e9eacb4c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-sjw98" Mar 20 11:15:55 crc kubenswrapper[4860]: I0320 11:15:55.822314 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1ac2367-dd61-4085-a756-ab4244a03144-config\") pod \"dnsmasq-dns-675f4bcbfc-dbwww\" (UID: \"f1ac2367-dd61-4085-a756-ab4244a03144\") " pod="openstack/dnsmasq-dns-675f4bcbfc-dbwww" Mar 20 11:15:55 crc kubenswrapper[4860]: I0320 11:15:55.844464 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzt8b\" (UniqueName: \"kubernetes.io/projected/f1ac2367-dd61-4085-a756-ab4244a03144-kube-api-access-dzt8b\") pod \"dnsmasq-dns-675f4bcbfc-dbwww\" (UID: \"f1ac2367-dd61-4085-a756-ab4244a03144\") " pod="openstack/dnsmasq-dns-675f4bcbfc-dbwww" Mar 20 11:15:55 crc kubenswrapper[4860]: I0320 
11:15:55.923365 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ccb7e541-f715-4030-8091-91f7e9eacb4c-config\") pod \"dnsmasq-dns-78dd6ddcc-sjw98\" (UID: \"ccb7e541-f715-4030-8091-91f7e9eacb4c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-sjw98" Mar 20 11:15:55 crc kubenswrapper[4860]: I0320 11:15:55.923518 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ccb7e541-f715-4030-8091-91f7e9eacb4c-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-sjw98\" (UID: \"ccb7e541-f715-4030-8091-91f7e9eacb4c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-sjw98" Mar 20 11:15:55 crc kubenswrapper[4860]: I0320 11:15:55.923573 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmlqn\" (UniqueName: \"kubernetes.io/projected/ccb7e541-f715-4030-8091-91f7e9eacb4c-kube-api-access-mmlqn\") pod \"dnsmasq-dns-78dd6ddcc-sjw98\" (UID: \"ccb7e541-f715-4030-8091-91f7e9eacb4c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-sjw98" Mar 20 11:15:55 crc kubenswrapper[4860]: I0320 11:15:55.924600 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ccb7e541-f715-4030-8091-91f7e9eacb4c-config\") pod \"dnsmasq-dns-78dd6ddcc-sjw98\" (UID: \"ccb7e541-f715-4030-8091-91f7e9eacb4c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-sjw98" Mar 20 11:15:55 crc kubenswrapper[4860]: I0320 11:15:55.924733 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ccb7e541-f715-4030-8091-91f7e9eacb4c-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-sjw98\" (UID: \"ccb7e541-f715-4030-8091-91f7e9eacb4c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-sjw98" Mar 20 11:15:55 crc kubenswrapper[4860]: I0320 11:15:55.935955 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-dbwww" Mar 20 11:15:55 crc kubenswrapper[4860]: I0320 11:15:55.942618 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmlqn\" (UniqueName: \"kubernetes.io/projected/ccb7e541-f715-4030-8091-91f7e9eacb4c-kube-api-access-mmlqn\") pod \"dnsmasq-dns-78dd6ddcc-sjw98\" (UID: \"ccb7e541-f715-4030-8091-91f7e9eacb4c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-sjw98" Mar 20 11:15:56 crc kubenswrapper[4860]: I0320 11:15:56.156883 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-sjw98" Mar 20 11:15:56 crc kubenswrapper[4860]: I0320 11:15:56.789570 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-dbwww"] Mar 20 11:15:57 crc kubenswrapper[4860]: I0320 11:15:57.088550 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-sjw98"] Mar 20 11:15:57 crc kubenswrapper[4860]: I0320 11:15:57.219883 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-dbwww" event={"ID":"f1ac2367-dd61-4085-a756-ab4244a03144","Type":"ContainerStarted","Data":"80f7e07847f25b420187a5f84e199c965d08309e8c61a1c3ef33ff79bb484a84"} Mar 20 11:15:57 crc kubenswrapper[4860]: I0320 11:15:57.220950 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-sjw98" event={"ID":"ccb7e541-f715-4030-8091-91f7e9eacb4c","Type":"ContainerStarted","Data":"6a85cdad59940ceecd7205c5b097a140eb3a8f0947b37763cb9efb1b4fece931"} Mar 20 11:16:00 crc kubenswrapper[4860]: I0320 11:16:00.158673 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566756-77l7w"] Mar 20 11:16:00 crc kubenswrapper[4860]: I0320 11:16:00.161005 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566756-77l7w" Mar 20 11:16:00 crc kubenswrapper[4860]: I0320 11:16:00.165836 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-82p2f" Mar 20 11:16:00 crc kubenswrapper[4860]: I0320 11:16:00.168361 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566756-77l7w"] Mar 20 11:16:00 crc kubenswrapper[4860]: I0320 11:16:00.171619 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:16:00 crc kubenswrapper[4860]: I0320 11:16:00.172033 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:16:00 crc kubenswrapper[4860]: I0320 11:16:00.245961 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5sn5\" (UniqueName: \"kubernetes.io/projected/638de697-8881-4bb2-b204-2e87655dccbf-kube-api-access-j5sn5\") pod \"auto-csr-approver-29566756-77l7w\" (UID: \"638de697-8881-4bb2-b204-2e87655dccbf\") " pod="openshift-infra/auto-csr-approver-29566756-77l7w" Mar 20 11:16:00 crc kubenswrapper[4860]: I0320 11:16:00.347462 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5sn5\" (UniqueName: \"kubernetes.io/projected/638de697-8881-4bb2-b204-2e87655dccbf-kube-api-access-j5sn5\") pod \"auto-csr-approver-29566756-77l7w\" (UID: \"638de697-8881-4bb2-b204-2e87655dccbf\") " pod="openshift-infra/auto-csr-approver-29566756-77l7w" Mar 20 11:16:00 crc kubenswrapper[4860]: I0320 11:16:00.370860 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5sn5\" (UniqueName: \"kubernetes.io/projected/638de697-8881-4bb2-b204-2e87655dccbf-kube-api-access-j5sn5\") pod \"auto-csr-approver-29566756-77l7w\" (UID: \"638de697-8881-4bb2-b204-2e87655dccbf\") " 
pod="openshift-infra/auto-csr-approver-29566756-77l7w" Mar 20 11:16:00 crc kubenswrapper[4860]: I0320 11:16:00.496384 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566756-77l7w" Mar 20 11:16:01 crc kubenswrapper[4860]: I0320 11:16:01.269879 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566756-77l7w"] Mar 20 11:16:06 crc kubenswrapper[4860]: I0320 11:16:06.389670 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566756-77l7w" event={"ID":"638de697-8881-4bb2-b204-2e87655dccbf","Type":"ContainerStarted","Data":"e094e59656f19c31a79490192ffde331223a8942624338458c25d40663cb239a"} Mar 20 11:16:19 crc kubenswrapper[4860]: E0320 11:16:19.730700 4860 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 20 11:16:19 crc kubenswrapper[4860]: E0320 11:16:19.731846 4860 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mmlqn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-sjw98_openstack(ccb7e541-f715-4030-8091-91f7e9eacb4c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 11:16:19 crc kubenswrapper[4860]: E0320 11:16:19.733053 4860 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-sjw98" podUID="ccb7e541-f715-4030-8091-91f7e9eacb4c" Mar 20 11:16:19 crc kubenswrapper[4860]: E0320 11:16:19.759044 4860 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 20 11:16:19 crc kubenswrapper[4860]: E0320 11:16:19.759294 4860 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dzt8b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-dbwww_openstack(f1ac2367-dd61-4085-a756-ab4244a03144): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 11:16:19 crc kubenswrapper[4860]: E0320 11:16:19.760483 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-dbwww" podUID="f1ac2367-dd61-4085-a756-ab4244a03144" Mar 20 11:16:20 crc kubenswrapper[4860]: I0320 11:16:20.658529 4860 generic.go:334] "Generic (PLEG): container finished" podID="638de697-8881-4bb2-b204-2e87655dccbf" containerID="615c37395180628a3c76825ddb15312c7ceadec62513b183ca243bd28c96c9ed" exitCode=0 Mar 20 11:16:20 crc kubenswrapper[4860]: I0320 11:16:20.658584 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566756-77l7w" event={"ID":"638de697-8881-4bb2-b204-2e87655dccbf","Type":"ContainerDied","Data":"615c37395180628a3c76825ddb15312c7ceadec62513b183ca243bd28c96c9ed"} Mar 20 11:16:20 crc kubenswrapper[4860]: E0320 11:16:20.660729 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-78dd6ddcc-sjw98" podUID="ccb7e541-f715-4030-8091-91f7e9eacb4c" Mar 20 11:16:20 crc kubenswrapper[4860]: E0320 11:16:20.660778 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-675f4bcbfc-dbwww" podUID="f1ac2367-dd61-4085-a756-ab4244a03144" Mar 20 11:16:21 crc kubenswrapper[4860]: I0320 11:16:21.962619 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566756-77l7w" Mar 20 11:16:22 crc kubenswrapper[4860]: I0320 11:16:22.092737 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5sn5\" (UniqueName: \"kubernetes.io/projected/638de697-8881-4bb2-b204-2e87655dccbf-kube-api-access-j5sn5\") pod \"638de697-8881-4bb2-b204-2e87655dccbf\" (UID: \"638de697-8881-4bb2-b204-2e87655dccbf\") " Mar 20 11:16:22 crc kubenswrapper[4860]: I0320 11:16:22.100530 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/638de697-8881-4bb2-b204-2e87655dccbf-kube-api-access-j5sn5" (OuterVolumeSpecName: "kube-api-access-j5sn5") pod "638de697-8881-4bb2-b204-2e87655dccbf" (UID: "638de697-8881-4bb2-b204-2e87655dccbf"). InnerVolumeSpecName "kube-api-access-j5sn5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:16:22 crc kubenswrapper[4860]: I0320 11:16:22.194281 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5sn5\" (UniqueName: \"kubernetes.io/projected/638de697-8881-4bb2-b204-2e87655dccbf-kube-api-access-j5sn5\") on node \"crc\" DevicePath \"\"" Mar 20 11:16:22 crc kubenswrapper[4860]: I0320 11:16:22.677364 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566756-77l7w" event={"ID":"638de697-8881-4bb2-b204-2e87655dccbf","Type":"ContainerDied","Data":"e094e59656f19c31a79490192ffde331223a8942624338458c25d40663cb239a"} Mar 20 11:16:22 crc kubenswrapper[4860]: I0320 11:16:22.677907 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e094e59656f19c31a79490192ffde331223a8942624338458c25d40663cb239a" Mar 20 11:16:22 crc kubenswrapper[4860]: I0320 11:16:22.677472 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566756-77l7w" Mar 20 11:16:23 crc kubenswrapper[4860]: I0320 11:16:23.038071 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566750-bgxvq"] Mar 20 11:16:23 crc kubenswrapper[4860]: I0320 11:16:23.044020 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566750-bgxvq"] Mar 20 11:16:23 crc kubenswrapper[4860]: I0320 11:16:23.423624 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b45dae17-b8e6-4d57-a525-2892e7ff37f7" path="/var/lib/kubelet/pods/b45dae17-b8e6-4d57-a525-2892e7ff37f7/volumes" Mar 20 11:16:33 crc kubenswrapper[4860]: I0320 11:16:33.770534 4860 generic.go:334] "Generic (PLEG): container finished" podID="ccb7e541-f715-4030-8091-91f7e9eacb4c" containerID="436c166a207b8ff3d6c5eee2e92fa2901918a501a46fa1762e6609f84378e3b4" exitCode=0 Mar 20 11:16:33 crc kubenswrapper[4860]: I0320 11:16:33.770636 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-sjw98" event={"ID":"ccb7e541-f715-4030-8091-91f7e9eacb4c","Type":"ContainerDied","Data":"436c166a207b8ff3d6c5eee2e92fa2901918a501a46fa1762e6609f84378e3b4"} Mar 20 11:16:34 crc kubenswrapper[4860]: I0320 11:16:34.785621 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-sjw98" event={"ID":"ccb7e541-f715-4030-8091-91f7e9eacb4c","Type":"ContainerStarted","Data":"4219324ec803228788fd007f3cb36c789afe296d11d52a560d37bcdf324d7f39"} Mar 20 11:16:34 crc kubenswrapper[4860]: I0320 11:16:34.786529 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-78dd6ddcc-sjw98" Mar 20 11:16:34 crc kubenswrapper[4860]: I0320 11:16:34.788007 4860 generic.go:334] "Generic (PLEG): container finished" podID="f1ac2367-dd61-4085-a756-ab4244a03144" containerID="b5035d8f8bb58d6e0bc5b22aaf08083d6babeae6b9315a1daf4306a6ccee0dcc" exitCode=0 Mar 20 
11:16:34 crc kubenswrapper[4860]: I0320 11:16:34.788048 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-dbwww" event={"ID":"f1ac2367-dd61-4085-a756-ab4244a03144","Type":"ContainerDied","Data":"b5035d8f8bb58d6e0bc5b22aaf08083d6babeae6b9315a1daf4306a6ccee0dcc"} Mar 20 11:16:34 crc kubenswrapper[4860]: I0320 11:16:34.810548 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-78dd6ddcc-sjw98" podStartSLOduration=4.025659316 podStartE2EDuration="39.81052246s" podCreationTimestamp="2026-03-20 11:15:55 +0000 UTC" firstStartedPulling="2026-03-20 11:15:57.09613722 +0000 UTC m=+1281.317498118" lastFinishedPulling="2026-03-20 11:16:32.881000364 +0000 UTC m=+1317.102361262" observedRunningTime="2026-03-20 11:16:34.805882464 +0000 UTC m=+1319.027243382" watchObservedRunningTime="2026-03-20 11:16:34.81052246 +0000 UTC m=+1319.031883358" Mar 20 11:16:35 crc kubenswrapper[4860]: I0320 11:16:35.796990 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-dbwww" event={"ID":"f1ac2367-dd61-4085-a756-ab4244a03144","Type":"ContainerStarted","Data":"e1607f1c44622aa70104e313e3bc627fe6b0f592fbbaece9de80793244209864"} Mar 20 11:16:35 crc kubenswrapper[4860]: I0320 11:16:35.797843 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-675f4bcbfc-dbwww" Mar 20 11:16:35 crc kubenswrapper[4860]: I0320 11:16:35.826538 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-675f4bcbfc-dbwww" podStartSLOduration=-9223371996.028267 podStartE2EDuration="40.826509217s" podCreationTimestamp="2026-03-20 11:15:55 +0000 UTC" firstStartedPulling="2026-03-20 11:15:56.796302041 +0000 UTC m=+1281.017662939" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 11:16:35.818640794 +0000 UTC m=+1320.040001722" watchObservedRunningTime="2026-03-20 11:16:35.826509217 
+0000 UTC m=+1320.047870105" Mar 20 11:16:40 crc kubenswrapper[4860]: I0320 11:16:40.938546 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-675f4bcbfc-dbwww" Mar 20 11:16:41 crc kubenswrapper[4860]: I0320 11:16:41.159039 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-78dd6ddcc-sjw98" Mar 20 11:16:41 crc kubenswrapper[4860]: I0320 11:16:41.208577 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-dbwww"] Mar 20 11:16:41 crc kubenswrapper[4860]: I0320 11:16:41.862560 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-675f4bcbfc-dbwww" podUID="f1ac2367-dd61-4085-a756-ab4244a03144" containerName="dnsmasq-dns" containerID="cri-o://e1607f1c44622aa70104e313e3bc627fe6b0f592fbbaece9de80793244209864" gracePeriod=10 Mar 20 11:16:42 crc kubenswrapper[4860]: I0320 11:16:42.273686 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-dbwww" Mar 20 11:16:42 crc kubenswrapper[4860]: I0320 11:16:42.468106 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dzt8b\" (UniqueName: \"kubernetes.io/projected/f1ac2367-dd61-4085-a756-ab4244a03144-kube-api-access-dzt8b\") pod \"f1ac2367-dd61-4085-a756-ab4244a03144\" (UID: \"f1ac2367-dd61-4085-a756-ab4244a03144\") " Mar 20 11:16:42 crc kubenswrapper[4860]: I0320 11:16:42.468733 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1ac2367-dd61-4085-a756-ab4244a03144-config\") pod \"f1ac2367-dd61-4085-a756-ab4244a03144\" (UID: \"f1ac2367-dd61-4085-a756-ab4244a03144\") " Mar 20 11:16:42 crc kubenswrapper[4860]: I0320 11:16:42.475665 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1ac2367-dd61-4085-a756-ab4244a03144-kube-api-access-dzt8b" (OuterVolumeSpecName: "kube-api-access-dzt8b") pod "f1ac2367-dd61-4085-a756-ab4244a03144" (UID: "f1ac2367-dd61-4085-a756-ab4244a03144"). InnerVolumeSpecName "kube-api-access-dzt8b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:16:42 crc kubenswrapper[4860]: I0320 11:16:42.505867 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1ac2367-dd61-4085-a756-ab4244a03144-config" (OuterVolumeSpecName: "config") pod "f1ac2367-dd61-4085-a756-ab4244a03144" (UID: "f1ac2367-dd61-4085-a756-ab4244a03144"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:16:42 crc kubenswrapper[4860]: I0320 11:16:42.570254 4860 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1ac2367-dd61-4085-a756-ab4244a03144-config\") on node \"crc\" DevicePath \"\"" Mar 20 11:16:42 crc kubenswrapper[4860]: I0320 11:16:42.570312 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dzt8b\" (UniqueName: \"kubernetes.io/projected/f1ac2367-dd61-4085-a756-ab4244a03144-kube-api-access-dzt8b\") on node \"crc\" DevicePath \"\"" Mar 20 11:16:42 crc kubenswrapper[4860]: I0320 11:16:42.873020 4860 generic.go:334] "Generic (PLEG): container finished" podID="f1ac2367-dd61-4085-a756-ab4244a03144" containerID="e1607f1c44622aa70104e313e3bc627fe6b0f592fbbaece9de80793244209864" exitCode=0 Mar 20 11:16:42 crc kubenswrapper[4860]: I0320 11:16:42.873059 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-dbwww" event={"ID":"f1ac2367-dd61-4085-a756-ab4244a03144","Type":"ContainerDied","Data":"e1607f1c44622aa70104e313e3bc627fe6b0f592fbbaece9de80793244209864"} Mar 20 11:16:42 crc kubenswrapper[4860]: I0320 11:16:42.873118 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-dbwww" event={"ID":"f1ac2367-dd61-4085-a756-ab4244a03144","Type":"ContainerDied","Data":"80f7e07847f25b420187a5f84e199c965d08309e8c61a1c3ef33ff79bb484a84"} Mar 20 11:16:42 crc kubenswrapper[4860]: I0320 11:16:42.873134 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-dbwww" Mar 20 11:16:42 crc kubenswrapper[4860]: I0320 11:16:42.873143 4860 scope.go:117] "RemoveContainer" containerID="e1607f1c44622aa70104e313e3bc627fe6b0f592fbbaece9de80793244209864" Mar 20 11:16:42 crc kubenswrapper[4860]: I0320 11:16:42.908428 4860 scope.go:117] "RemoveContainer" containerID="b5035d8f8bb58d6e0bc5b22aaf08083d6babeae6b9315a1daf4306a6ccee0dcc" Mar 20 11:16:42 crc kubenswrapper[4860]: I0320 11:16:42.914460 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-dbwww"] Mar 20 11:16:42 crc kubenswrapper[4860]: I0320 11:16:42.921297 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-dbwww"] Mar 20 11:16:42 crc kubenswrapper[4860]: I0320 11:16:42.933406 4860 scope.go:117] "RemoveContainer" containerID="e1607f1c44622aa70104e313e3bc627fe6b0f592fbbaece9de80793244209864" Mar 20 11:16:42 crc kubenswrapper[4860]: E0320 11:16:42.934179 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1607f1c44622aa70104e313e3bc627fe6b0f592fbbaece9de80793244209864\": container with ID starting with e1607f1c44622aa70104e313e3bc627fe6b0f592fbbaece9de80793244209864 not found: ID does not exist" containerID="e1607f1c44622aa70104e313e3bc627fe6b0f592fbbaece9de80793244209864" Mar 20 11:16:42 crc kubenswrapper[4860]: I0320 11:16:42.934250 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1607f1c44622aa70104e313e3bc627fe6b0f592fbbaece9de80793244209864"} err="failed to get container status \"e1607f1c44622aa70104e313e3bc627fe6b0f592fbbaece9de80793244209864\": rpc error: code = NotFound desc = could not find container \"e1607f1c44622aa70104e313e3bc627fe6b0f592fbbaece9de80793244209864\": container with ID starting with e1607f1c44622aa70104e313e3bc627fe6b0f592fbbaece9de80793244209864 not found: ID does not exist" Mar 20 
11:16:42 crc kubenswrapper[4860]: I0320 11:16:42.934284 4860 scope.go:117] "RemoveContainer" containerID="b5035d8f8bb58d6e0bc5b22aaf08083d6babeae6b9315a1daf4306a6ccee0dcc" Mar 20 11:16:42 crc kubenswrapper[4860]: E0320 11:16:42.934766 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5035d8f8bb58d6e0bc5b22aaf08083d6babeae6b9315a1daf4306a6ccee0dcc\": container with ID starting with b5035d8f8bb58d6e0bc5b22aaf08083d6babeae6b9315a1daf4306a6ccee0dcc not found: ID does not exist" containerID="b5035d8f8bb58d6e0bc5b22aaf08083d6babeae6b9315a1daf4306a6ccee0dcc" Mar 20 11:16:42 crc kubenswrapper[4860]: I0320 11:16:42.934795 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5035d8f8bb58d6e0bc5b22aaf08083d6babeae6b9315a1daf4306a6ccee0dcc"} err="failed to get container status \"b5035d8f8bb58d6e0bc5b22aaf08083d6babeae6b9315a1daf4306a6ccee0dcc\": rpc error: code = NotFound desc = could not find container \"b5035d8f8bb58d6e0bc5b22aaf08083d6babeae6b9315a1daf4306a6ccee0dcc\": container with ID starting with b5035d8f8bb58d6e0bc5b22aaf08083d6babeae6b9315a1daf4306a6ccee0dcc not found: ID does not exist" Mar 20 11:16:43 crc kubenswrapper[4860]: I0320 11:16:43.423150 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1ac2367-dd61-4085-a756-ab4244a03144" path="/var/lib/kubelet/pods/f1ac2367-dd61-4085-a756-ab4244a03144/volumes" Mar 20 11:16:47 crc kubenswrapper[4860]: I0320 11:16:47.834388 4860 scope.go:117] "RemoveContainer" containerID="bb236d0c90b35c798ab0b91ca64ed98eb462e09d8cbe538c6779b53064938615" Mar 20 11:17:22 crc kubenswrapper[4860]: I0320 11:17:22.344075 4860 patch_prober.go:28] interesting pod/machine-config-daemon-kvdqp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Mar 20 11:17:22 crc kubenswrapper[4860]: I0320 11:17:22.345046 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:17:52 crc kubenswrapper[4860]: I0320 11:17:52.346384 4860 patch_prober.go:28] interesting pod/machine-config-daemon-kvdqp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:17:52 crc kubenswrapper[4860]: I0320 11:17:52.347065 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:18:00 crc kubenswrapper[4860]: I0320 11:18:00.153429 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566758-bblpr"] Mar 20 11:18:00 crc kubenswrapper[4860]: E0320 11:18:00.154752 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="638de697-8881-4bb2-b204-2e87655dccbf" containerName="oc" Mar 20 11:18:00 crc kubenswrapper[4860]: I0320 11:18:00.154770 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="638de697-8881-4bb2-b204-2e87655dccbf" containerName="oc" Mar 20 11:18:00 crc kubenswrapper[4860]: E0320 11:18:00.154795 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1ac2367-dd61-4085-a756-ab4244a03144" containerName="init" Mar 20 11:18:00 crc kubenswrapper[4860]: I0320 11:18:00.154802 4860 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="f1ac2367-dd61-4085-a756-ab4244a03144" containerName="init" Mar 20 11:18:00 crc kubenswrapper[4860]: E0320 11:18:00.154827 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1ac2367-dd61-4085-a756-ab4244a03144" containerName="dnsmasq-dns" Mar 20 11:18:00 crc kubenswrapper[4860]: I0320 11:18:00.154835 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1ac2367-dd61-4085-a756-ab4244a03144" containerName="dnsmasq-dns" Mar 20 11:18:00 crc kubenswrapper[4860]: I0320 11:18:00.155015 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1ac2367-dd61-4085-a756-ab4244a03144" containerName="dnsmasq-dns" Mar 20 11:18:00 crc kubenswrapper[4860]: I0320 11:18:00.155032 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="638de697-8881-4bb2-b204-2e87655dccbf" containerName="oc" Mar 20 11:18:00 crc kubenswrapper[4860]: I0320 11:18:00.155766 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566758-bblpr" Mar 20 11:18:00 crc kubenswrapper[4860]: I0320 11:18:00.159976 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:18:00 crc kubenswrapper[4860]: I0320 11:18:00.160575 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:18:00 crc kubenswrapper[4860]: I0320 11:18:00.162305 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-82p2f" Mar 20 11:18:00 crc kubenswrapper[4860]: I0320 11:18:00.167010 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566758-bblpr"] Mar 20 11:18:00 crc kubenswrapper[4860]: I0320 11:18:00.343398 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zs5x5\" (UniqueName: 
\"kubernetes.io/projected/3d4db42a-d915-4c4d-a985-be77a5381514-kube-api-access-zs5x5\") pod \"auto-csr-approver-29566758-bblpr\" (UID: \"3d4db42a-d915-4c4d-a985-be77a5381514\") " pod="openshift-infra/auto-csr-approver-29566758-bblpr" Mar 20 11:18:00 crc kubenswrapper[4860]: I0320 11:18:00.445399 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zs5x5\" (UniqueName: \"kubernetes.io/projected/3d4db42a-d915-4c4d-a985-be77a5381514-kube-api-access-zs5x5\") pod \"auto-csr-approver-29566758-bblpr\" (UID: \"3d4db42a-d915-4c4d-a985-be77a5381514\") " pod="openshift-infra/auto-csr-approver-29566758-bblpr" Mar 20 11:18:00 crc kubenswrapper[4860]: I0320 11:18:00.475561 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zs5x5\" (UniqueName: \"kubernetes.io/projected/3d4db42a-d915-4c4d-a985-be77a5381514-kube-api-access-zs5x5\") pod \"auto-csr-approver-29566758-bblpr\" (UID: \"3d4db42a-d915-4c4d-a985-be77a5381514\") " pod="openshift-infra/auto-csr-approver-29566758-bblpr" Mar 20 11:18:00 crc kubenswrapper[4860]: I0320 11:18:00.479265 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566758-bblpr" Mar 20 11:18:00 crc kubenswrapper[4860]: I0320 11:18:00.925350 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566758-bblpr"] Mar 20 11:18:01 crc kubenswrapper[4860]: I0320 11:18:01.517646 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566758-bblpr" event={"ID":"3d4db42a-d915-4c4d-a985-be77a5381514","Type":"ContainerStarted","Data":"77bf8420052e50572c0a65816c72120e464ce959a0d30fadee753600216da879"} Mar 20 11:18:02 crc kubenswrapper[4860]: I0320 11:18:02.527384 4860 generic.go:334] "Generic (PLEG): container finished" podID="3d4db42a-d915-4c4d-a985-be77a5381514" containerID="666fc76c19255af020ada26a1d756d00c6fc27b0113301cab647ad4c35e9ef0c" exitCode=0 Mar 20 11:18:02 crc kubenswrapper[4860]: I0320 11:18:02.527448 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566758-bblpr" event={"ID":"3d4db42a-d915-4c4d-a985-be77a5381514","Type":"ContainerDied","Data":"666fc76c19255af020ada26a1d756d00c6fc27b0113301cab647ad4c35e9ef0c"} Mar 20 11:18:03 crc kubenswrapper[4860]: I0320 11:18:03.822097 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566758-bblpr" Mar 20 11:18:04 crc kubenswrapper[4860]: I0320 11:18:04.005283 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zs5x5\" (UniqueName: \"kubernetes.io/projected/3d4db42a-d915-4c4d-a985-be77a5381514-kube-api-access-zs5x5\") pod \"3d4db42a-d915-4c4d-a985-be77a5381514\" (UID: \"3d4db42a-d915-4c4d-a985-be77a5381514\") " Mar 20 11:18:04 crc kubenswrapper[4860]: I0320 11:18:04.013605 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d4db42a-d915-4c4d-a985-be77a5381514-kube-api-access-zs5x5" (OuterVolumeSpecName: "kube-api-access-zs5x5") pod "3d4db42a-d915-4c4d-a985-be77a5381514" (UID: "3d4db42a-d915-4c4d-a985-be77a5381514"). InnerVolumeSpecName "kube-api-access-zs5x5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:18:04 crc kubenswrapper[4860]: I0320 11:18:04.107264 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zs5x5\" (UniqueName: \"kubernetes.io/projected/3d4db42a-d915-4c4d-a985-be77a5381514-kube-api-access-zs5x5\") on node \"crc\" DevicePath \"\"" Mar 20 11:18:04 crc kubenswrapper[4860]: I0320 11:18:04.545399 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566758-bblpr" event={"ID":"3d4db42a-d915-4c4d-a985-be77a5381514","Type":"ContainerDied","Data":"77bf8420052e50572c0a65816c72120e464ce959a0d30fadee753600216da879"} Mar 20 11:18:04 crc kubenswrapper[4860]: I0320 11:18:04.545911 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="77bf8420052e50572c0a65816c72120e464ce959a0d30fadee753600216da879" Mar 20 11:18:04 crc kubenswrapper[4860]: I0320 11:18:04.545757 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566758-bblpr" Mar 20 11:18:04 crc kubenswrapper[4860]: I0320 11:18:04.899980 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566752-kn2qv"] Mar 20 11:18:04 crc kubenswrapper[4860]: I0320 11:18:04.906430 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566752-kn2qv"] Mar 20 11:18:05 crc kubenswrapper[4860]: I0320 11:18:05.429012 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdc939b6-92ac-4e00-ae32-b518e4257043" path="/var/lib/kubelet/pods/fdc939b6-92ac-4e00-ae32-b518e4257043/volumes" Mar 20 11:18:22 crc kubenswrapper[4860]: I0320 11:18:22.344780 4860 patch_prober.go:28] interesting pod/machine-config-daemon-kvdqp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:18:22 crc kubenswrapper[4860]: I0320 11:18:22.345674 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:18:22 crc kubenswrapper[4860]: I0320 11:18:22.345736 4860 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" Mar 20 11:18:22 crc kubenswrapper[4860]: I0320 11:18:22.346707 4860 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"30e51fcb3abe382780764ee7923f09110290ea7a894489479cd1fa264f3e332e"} pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 11:18:22 crc kubenswrapper[4860]: I0320 11:18:22.346774 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" containerID="cri-o://30e51fcb3abe382780764ee7923f09110290ea7a894489479cd1fa264f3e332e" gracePeriod=600 Mar 20 11:18:22 crc kubenswrapper[4860]: I0320 11:18:22.697663 4860 generic.go:334] "Generic (PLEG): container finished" podID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerID="30e51fcb3abe382780764ee7923f09110290ea7a894489479cd1fa264f3e332e" exitCode=0 Mar 20 11:18:22 crc kubenswrapper[4860]: I0320 11:18:22.697739 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" event={"ID":"6a9df230-75a1-4b64-8d00-c179e9c19080","Type":"ContainerDied","Data":"30e51fcb3abe382780764ee7923f09110290ea7a894489479cd1fa264f3e332e"} Mar 20 11:18:22 crc kubenswrapper[4860]: I0320 11:18:22.698196 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" event={"ID":"6a9df230-75a1-4b64-8d00-c179e9c19080","Type":"ContainerStarted","Data":"07ab4f7bad4a73673176b5c6eb66987e832ab6b48e2270c87816fed93c5b31d4"} Mar 20 11:18:22 crc kubenswrapper[4860]: I0320 11:18:22.698280 4860 scope.go:117] "RemoveContainer" containerID="88d9c34d042546706542bef7e7c591b52fa1f874953861e5601a8c6aea607a26" Mar 20 11:18:47 crc kubenswrapper[4860]: I0320 11:18:47.927973 4860 scope.go:117] "RemoveContainer" containerID="986b40f9c6d66be5183f5bf7b868f2a3962c56f81df4ee2138cb170b1b825e18" Mar 20 11:20:00 crc kubenswrapper[4860]: I0320 11:20:00.157208 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566760-gs8qf"] Mar 20 11:20:00 crc kubenswrapper[4860]: E0320 
11:20:00.158284 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d4db42a-d915-4c4d-a985-be77a5381514" containerName="oc" Mar 20 11:20:00 crc kubenswrapper[4860]: I0320 11:20:00.158304 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d4db42a-d915-4c4d-a985-be77a5381514" containerName="oc" Mar 20 11:20:00 crc kubenswrapper[4860]: I0320 11:20:00.158559 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d4db42a-d915-4c4d-a985-be77a5381514" containerName="oc" Mar 20 11:20:00 crc kubenswrapper[4860]: I0320 11:20:00.159252 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566760-gs8qf" Mar 20 11:20:00 crc kubenswrapper[4860]: I0320 11:20:00.162713 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:20:00 crc kubenswrapper[4860]: I0320 11:20:00.163131 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-82p2f" Mar 20 11:20:00 crc kubenswrapper[4860]: I0320 11:20:00.167180 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:20:00 crc kubenswrapper[4860]: I0320 11:20:00.173783 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566760-gs8qf"] Mar 20 11:20:00 crc kubenswrapper[4860]: I0320 11:20:00.289042 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhtrs\" (UniqueName: \"kubernetes.io/projected/02d4a854-e21a-46b4-976b-17645af17c8b-kube-api-access-bhtrs\") pod \"auto-csr-approver-29566760-gs8qf\" (UID: \"02d4a854-e21a-46b4-976b-17645af17c8b\") " pod="openshift-infra/auto-csr-approver-29566760-gs8qf" Mar 20 11:20:00 crc kubenswrapper[4860]: I0320 11:20:00.391373 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-bhtrs\" (UniqueName: \"kubernetes.io/projected/02d4a854-e21a-46b4-976b-17645af17c8b-kube-api-access-bhtrs\") pod \"auto-csr-approver-29566760-gs8qf\" (UID: \"02d4a854-e21a-46b4-976b-17645af17c8b\") " pod="openshift-infra/auto-csr-approver-29566760-gs8qf" Mar 20 11:20:00 crc kubenswrapper[4860]: I0320 11:20:00.415084 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhtrs\" (UniqueName: \"kubernetes.io/projected/02d4a854-e21a-46b4-976b-17645af17c8b-kube-api-access-bhtrs\") pod \"auto-csr-approver-29566760-gs8qf\" (UID: \"02d4a854-e21a-46b4-976b-17645af17c8b\") " pod="openshift-infra/auto-csr-approver-29566760-gs8qf" Mar 20 11:20:00 crc kubenswrapper[4860]: I0320 11:20:00.482948 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566760-gs8qf" Mar 20 11:20:00 crc kubenswrapper[4860]: I0320 11:20:00.917562 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566760-gs8qf"] Mar 20 11:20:01 crc kubenswrapper[4860]: I0320 11:20:01.603851 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566760-gs8qf" event={"ID":"02d4a854-e21a-46b4-976b-17645af17c8b","Type":"ContainerStarted","Data":"a85478ba1f3ebe87684633db4f6f4973b86ce2fd1180f8cc06726b92ab0ef1cf"} Mar 20 11:20:02 crc kubenswrapper[4860]: I0320 11:20:02.615577 4860 generic.go:334] "Generic (PLEG): container finished" podID="02d4a854-e21a-46b4-976b-17645af17c8b" containerID="b7ac94cb420b15b471714072b91c8c315997e55e973785d5b6c9d7428acd11e7" exitCode=0 Mar 20 11:20:02 crc kubenswrapper[4860]: I0320 11:20:02.615687 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566760-gs8qf" event={"ID":"02d4a854-e21a-46b4-976b-17645af17c8b","Type":"ContainerDied","Data":"b7ac94cb420b15b471714072b91c8c315997e55e973785d5b6c9d7428acd11e7"} Mar 20 11:20:03 crc kubenswrapper[4860]: I0320 
11:20:03.907776 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566760-gs8qf" Mar 20 11:20:04 crc kubenswrapper[4860]: I0320 11:20:04.048774 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhtrs\" (UniqueName: \"kubernetes.io/projected/02d4a854-e21a-46b4-976b-17645af17c8b-kube-api-access-bhtrs\") pod \"02d4a854-e21a-46b4-976b-17645af17c8b\" (UID: \"02d4a854-e21a-46b4-976b-17645af17c8b\") " Mar 20 11:20:04 crc kubenswrapper[4860]: I0320 11:20:04.055072 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02d4a854-e21a-46b4-976b-17645af17c8b-kube-api-access-bhtrs" (OuterVolumeSpecName: "kube-api-access-bhtrs") pod "02d4a854-e21a-46b4-976b-17645af17c8b" (UID: "02d4a854-e21a-46b4-976b-17645af17c8b"). InnerVolumeSpecName "kube-api-access-bhtrs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:20:04 crc kubenswrapper[4860]: I0320 11:20:04.151069 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bhtrs\" (UniqueName: \"kubernetes.io/projected/02d4a854-e21a-46b4-976b-17645af17c8b-kube-api-access-bhtrs\") on node \"crc\" DevicePath \"\"" Mar 20 11:20:04 crc kubenswrapper[4860]: I0320 11:20:04.634637 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566760-gs8qf" event={"ID":"02d4a854-e21a-46b4-976b-17645af17c8b","Type":"ContainerDied","Data":"a85478ba1f3ebe87684633db4f6f4973b86ce2fd1180f8cc06726b92ab0ef1cf"} Mar 20 11:20:04 crc kubenswrapper[4860]: I0320 11:20:04.634699 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a85478ba1f3ebe87684633db4f6f4973b86ce2fd1180f8cc06726b92ab0ef1cf" Mar 20 11:20:04 crc kubenswrapper[4860]: I0320 11:20:04.634707 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566760-gs8qf" Mar 20 11:20:04 crc kubenswrapper[4860]: I0320 11:20:04.976539 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566754-wdxxk"] Mar 20 11:20:04 crc kubenswrapper[4860]: I0320 11:20:04.984181 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566754-wdxxk"] Mar 20 11:20:05 crc kubenswrapper[4860]: I0320 11:20:05.429515 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a90da115-522c-4858-935f-7d4a7211c8cb" path="/var/lib/kubelet/pods/a90da115-522c-4858-935f-7d4a7211c8cb/volumes" Mar 20 11:20:22 crc kubenswrapper[4860]: I0320 11:20:22.344551 4860 patch_prober.go:28] interesting pod/machine-config-daemon-kvdqp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:20:22 crc kubenswrapper[4860]: I0320 11:20:22.345461 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:20:48 crc kubenswrapper[4860]: I0320 11:20:48.013109 4860 scope.go:117] "RemoveContainer" containerID="f841889007b20caf67f3aa615ee7ba8514f947be4c45bbec766af3b7f7efe1d8" Mar 20 11:20:52 crc kubenswrapper[4860]: I0320 11:20:52.345116 4860 patch_prober.go:28] interesting pod/machine-config-daemon-kvdqp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:20:52 crc kubenswrapper[4860]: 
I0320 11:20:52.347710 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:21:22 crc kubenswrapper[4860]: I0320 11:21:22.345281 4860 patch_prober.go:28] interesting pod/machine-config-daemon-kvdqp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:21:22 crc kubenswrapper[4860]: I0320 11:21:22.346045 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:21:22 crc kubenswrapper[4860]: I0320 11:21:22.346113 4860 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" Mar 20 11:21:22 crc kubenswrapper[4860]: I0320 11:21:22.346941 4860 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"07ab4f7bad4a73673176b5c6eb66987e832ab6b48e2270c87816fed93c5b31d4"} pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 11:21:22 crc kubenswrapper[4860]: I0320 11:21:22.347010 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" 
containerName="machine-config-daemon" containerID="cri-o://07ab4f7bad4a73673176b5c6eb66987e832ab6b48e2270c87816fed93c5b31d4" gracePeriod=600 Mar 20 11:21:22 crc kubenswrapper[4860]: E0320 11:21:22.471775 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:21:23 crc kubenswrapper[4860]: I0320 11:21:23.285063 4860 generic.go:334] "Generic (PLEG): container finished" podID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerID="07ab4f7bad4a73673176b5c6eb66987e832ab6b48e2270c87816fed93c5b31d4" exitCode=0 Mar 20 11:21:23 crc kubenswrapper[4860]: I0320 11:21:23.285103 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" event={"ID":"6a9df230-75a1-4b64-8d00-c179e9c19080","Type":"ContainerDied","Data":"07ab4f7bad4a73673176b5c6eb66987e832ab6b48e2270c87816fed93c5b31d4"} Mar 20 11:21:23 crc kubenswrapper[4860]: I0320 11:21:23.285698 4860 scope.go:117] "RemoveContainer" containerID="30e51fcb3abe382780764ee7923f09110290ea7a894489479cd1fa264f3e332e" Mar 20 11:21:23 crc kubenswrapper[4860]: I0320 11:21:23.286477 4860 scope.go:117] "RemoveContainer" containerID="07ab4f7bad4a73673176b5c6eb66987e832ab6b48e2270c87816fed93c5b31d4" Mar 20 11:21:23 crc kubenswrapper[4860]: E0320 11:21:23.286767 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:21:29 crc kubenswrapper[4860]: I0320 11:21:29.040628 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9gvjr"] Mar 20 11:21:29 crc kubenswrapper[4860]: E0320 11:21:29.042176 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02d4a854-e21a-46b4-976b-17645af17c8b" containerName="oc" Mar 20 11:21:29 crc kubenswrapper[4860]: I0320 11:21:29.042199 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="02d4a854-e21a-46b4-976b-17645af17c8b" containerName="oc" Mar 20 11:21:29 crc kubenswrapper[4860]: I0320 11:21:29.042462 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="02d4a854-e21a-46b4-976b-17645af17c8b" containerName="oc" Mar 20 11:21:29 crc kubenswrapper[4860]: I0320 11:21:29.043889 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9gvjr" Mar 20 11:21:29 crc kubenswrapper[4860]: I0320 11:21:29.061866 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9gvjr"] Mar 20 11:21:29 crc kubenswrapper[4860]: I0320 11:21:29.084853 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fa1c68c-e71f-456c-a53f-1ba28dd3952f-utilities\") pod \"redhat-operators-9gvjr\" (UID: \"8fa1c68c-e71f-456c-a53f-1ba28dd3952f\") " pod="openshift-marketplace/redhat-operators-9gvjr" Mar 20 11:21:29 crc kubenswrapper[4860]: I0320 11:21:29.084903 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fa1c68c-e71f-456c-a53f-1ba28dd3952f-catalog-content\") pod \"redhat-operators-9gvjr\" (UID: \"8fa1c68c-e71f-456c-a53f-1ba28dd3952f\") " 
pod="openshift-marketplace/redhat-operators-9gvjr" Mar 20 11:21:29 crc kubenswrapper[4860]: I0320 11:21:29.084956 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5zsx\" (UniqueName: \"kubernetes.io/projected/8fa1c68c-e71f-456c-a53f-1ba28dd3952f-kube-api-access-v5zsx\") pod \"redhat-operators-9gvjr\" (UID: \"8fa1c68c-e71f-456c-a53f-1ba28dd3952f\") " pod="openshift-marketplace/redhat-operators-9gvjr" Mar 20 11:21:29 crc kubenswrapper[4860]: I0320 11:21:29.186339 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5zsx\" (UniqueName: \"kubernetes.io/projected/8fa1c68c-e71f-456c-a53f-1ba28dd3952f-kube-api-access-v5zsx\") pod \"redhat-operators-9gvjr\" (UID: \"8fa1c68c-e71f-456c-a53f-1ba28dd3952f\") " pod="openshift-marketplace/redhat-operators-9gvjr" Mar 20 11:21:29 crc kubenswrapper[4860]: I0320 11:21:29.186496 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fa1c68c-e71f-456c-a53f-1ba28dd3952f-utilities\") pod \"redhat-operators-9gvjr\" (UID: \"8fa1c68c-e71f-456c-a53f-1ba28dd3952f\") " pod="openshift-marketplace/redhat-operators-9gvjr" Mar 20 11:21:29 crc kubenswrapper[4860]: I0320 11:21:29.186530 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fa1c68c-e71f-456c-a53f-1ba28dd3952f-catalog-content\") pod \"redhat-operators-9gvjr\" (UID: \"8fa1c68c-e71f-456c-a53f-1ba28dd3952f\") " pod="openshift-marketplace/redhat-operators-9gvjr" Mar 20 11:21:29 crc kubenswrapper[4860]: I0320 11:21:29.187299 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fa1c68c-e71f-456c-a53f-1ba28dd3952f-catalog-content\") pod \"redhat-operators-9gvjr\" (UID: \"8fa1c68c-e71f-456c-a53f-1ba28dd3952f\") " 
pod="openshift-marketplace/redhat-operators-9gvjr" Mar 20 11:21:29 crc kubenswrapper[4860]: I0320 11:21:29.187319 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fa1c68c-e71f-456c-a53f-1ba28dd3952f-utilities\") pod \"redhat-operators-9gvjr\" (UID: \"8fa1c68c-e71f-456c-a53f-1ba28dd3952f\") " pod="openshift-marketplace/redhat-operators-9gvjr" Mar 20 11:21:29 crc kubenswrapper[4860]: I0320 11:21:29.208958 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5zsx\" (UniqueName: \"kubernetes.io/projected/8fa1c68c-e71f-456c-a53f-1ba28dd3952f-kube-api-access-v5zsx\") pod \"redhat-operators-9gvjr\" (UID: \"8fa1c68c-e71f-456c-a53f-1ba28dd3952f\") " pod="openshift-marketplace/redhat-operators-9gvjr" Mar 20 11:21:29 crc kubenswrapper[4860]: I0320 11:21:29.380216 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9gvjr" Mar 20 11:21:29 crc kubenswrapper[4860]: I0320 11:21:29.853520 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9gvjr"] Mar 20 11:21:30 crc kubenswrapper[4860]: I0320 11:21:30.369255 4860 generic.go:334] "Generic (PLEG): container finished" podID="8fa1c68c-e71f-456c-a53f-1ba28dd3952f" containerID="bd1b68278e2421bec44699bb69debaba80a232532bb0174492da79e52abc9baf" exitCode=0 Mar 20 11:21:30 crc kubenswrapper[4860]: I0320 11:21:30.369459 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9gvjr" event={"ID":"8fa1c68c-e71f-456c-a53f-1ba28dd3952f","Type":"ContainerDied","Data":"bd1b68278e2421bec44699bb69debaba80a232532bb0174492da79e52abc9baf"} Mar 20 11:21:30 crc kubenswrapper[4860]: I0320 11:21:30.369566 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9gvjr" 
event={"ID":"8fa1c68c-e71f-456c-a53f-1ba28dd3952f","Type":"ContainerStarted","Data":"fcc3ad862451d2531d00ef7f4051c1f59243588e81decdce68473cd448e73080"} Mar 20 11:21:30 crc kubenswrapper[4860]: I0320 11:21:30.372296 4860 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 11:21:32 crc kubenswrapper[4860]: I0320 11:21:32.390007 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9gvjr" event={"ID":"8fa1c68c-e71f-456c-a53f-1ba28dd3952f","Type":"ContainerStarted","Data":"a15ba701649b81b12d2b014a552468aa2bfff91a5766797a31426e0bf11ff89f"} Mar 20 11:21:33 crc kubenswrapper[4860]: I0320 11:21:33.399407 4860 generic.go:334] "Generic (PLEG): container finished" podID="8fa1c68c-e71f-456c-a53f-1ba28dd3952f" containerID="a15ba701649b81b12d2b014a552468aa2bfff91a5766797a31426e0bf11ff89f" exitCode=0 Mar 20 11:21:33 crc kubenswrapper[4860]: I0320 11:21:33.399520 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9gvjr" event={"ID":"8fa1c68c-e71f-456c-a53f-1ba28dd3952f","Type":"ContainerDied","Data":"a15ba701649b81b12d2b014a552468aa2bfff91a5766797a31426e0bf11ff89f"} Mar 20 11:21:34 crc kubenswrapper[4860]: I0320 11:21:34.410619 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9gvjr" event={"ID":"8fa1c68c-e71f-456c-a53f-1ba28dd3952f","Type":"ContainerStarted","Data":"a0b6912b94144e45036f3769d17d2bd8ade70f17a27dfd648678da71cf95cfbd"} Mar 20 11:21:34 crc kubenswrapper[4860]: I0320 11:21:34.413859 4860 scope.go:117] "RemoveContainer" containerID="07ab4f7bad4a73673176b5c6eb66987e832ab6b48e2270c87816fed93c5b31d4" Mar 20 11:21:34 crc kubenswrapper[4860]: E0320 11:21:34.414272 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:21:34 crc kubenswrapper[4860]: I0320 11:21:34.437767 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9gvjr" podStartSLOduration=1.902814372 podStartE2EDuration="5.437739146s" podCreationTimestamp="2026-03-20 11:21:29 +0000 UTC" firstStartedPulling="2026-03-20 11:21:30.372072807 +0000 UTC m=+1614.593433695" lastFinishedPulling="2026-03-20 11:21:33.906997571 +0000 UTC m=+1618.128358469" observedRunningTime="2026-03-20 11:21:34.431200008 +0000 UTC m=+1618.652560906" watchObservedRunningTime="2026-03-20 11:21:34.437739146 +0000 UTC m=+1618.659100044" Mar 20 11:21:39 crc kubenswrapper[4860]: I0320 11:21:39.380665 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9gvjr" Mar 20 11:21:39 crc kubenswrapper[4860]: I0320 11:21:39.381090 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9gvjr" Mar 20 11:21:40 crc kubenswrapper[4860]: I0320 11:21:40.430303 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9gvjr" podUID="8fa1c68c-e71f-456c-a53f-1ba28dd3952f" containerName="registry-server" probeResult="failure" output=< Mar 20 11:21:40 crc kubenswrapper[4860]: timeout: failed to connect service ":50051" within 1s Mar 20 11:21:40 crc kubenswrapper[4860]: > Mar 20 11:21:48 crc kubenswrapper[4860]: I0320 11:21:48.414108 4860 scope.go:117] "RemoveContainer" containerID="07ab4f7bad4a73673176b5c6eb66987e832ab6b48e2270c87816fed93c5b31d4" Mar 20 11:21:48 crc kubenswrapper[4860]: E0320 11:21:48.415302 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:21:49 crc kubenswrapper[4860]: I0320 11:21:49.430482 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9gvjr" Mar 20 11:21:49 crc kubenswrapper[4860]: I0320 11:21:49.477800 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9gvjr" Mar 20 11:21:49 crc kubenswrapper[4860]: I0320 11:21:49.665126 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9gvjr"] Mar 20 11:21:50 crc kubenswrapper[4860]: I0320 11:21:50.554220 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9gvjr" podUID="8fa1c68c-e71f-456c-a53f-1ba28dd3952f" containerName="registry-server" containerID="cri-o://a0b6912b94144e45036f3769d17d2bd8ade70f17a27dfd648678da71cf95cfbd" gracePeriod=2 Mar 20 11:21:51 crc kubenswrapper[4860]: I0320 11:21:51.513651 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9gvjr" Mar 20 11:21:51 crc kubenswrapper[4860]: I0320 11:21:51.564309 4860 generic.go:334] "Generic (PLEG): container finished" podID="8fa1c68c-e71f-456c-a53f-1ba28dd3952f" containerID="a0b6912b94144e45036f3769d17d2bd8ade70f17a27dfd648678da71cf95cfbd" exitCode=0 Mar 20 11:21:51 crc kubenswrapper[4860]: I0320 11:21:51.564369 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9gvjr" event={"ID":"8fa1c68c-e71f-456c-a53f-1ba28dd3952f","Type":"ContainerDied","Data":"a0b6912b94144e45036f3769d17d2bd8ade70f17a27dfd648678da71cf95cfbd"} Mar 20 11:21:51 crc kubenswrapper[4860]: I0320 11:21:51.564407 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9gvjr" event={"ID":"8fa1c68c-e71f-456c-a53f-1ba28dd3952f","Type":"ContainerDied","Data":"fcc3ad862451d2531d00ef7f4051c1f59243588e81decdce68473cd448e73080"} Mar 20 11:21:51 crc kubenswrapper[4860]: I0320 11:21:51.564430 4860 scope.go:117] "RemoveContainer" containerID="a0b6912b94144e45036f3769d17d2bd8ade70f17a27dfd648678da71cf95cfbd" Mar 20 11:21:51 crc kubenswrapper[4860]: I0320 11:21:51.564657 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9gvjr" Mar 20 11:21:51 crc kubenswrapper[4860]: I0320 11:21:51.590299 4860 scope.go:117] "RemoveContainer" containerID="a15ba701649b81b12d2b014a552468aa2bfff91a5766797a31426e0bf11ff89f" Mar 20 11:21:51 crc kubenswrapper[4860]: I0320 11:21:51.615738 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fa1c68c-e71f-456c-a53f-1ba28dd3952f-utilities\") pod \"8fa1c68c-e71f-456c-a53f-1ba28dd3952f\" (UID: \"8fa1c68c-e71f-456c-a53f-1ba28dd3952f\") " Mar 20 11:21:51 crc kubenswrapper[4860]: I0320 11:21:51.615887 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fa1c68c-e71f-456c-a53f-1ba28dd3952f-catalog-content\") pod \"8fa1c68c-e71f-456c-a53f-1ba28dd3952f\" (UID: \"8fa1c68c-e71f-456c-a53f-1ba28dd3952f\") " Mar 20 11:21:51 crc kubenswrapper[4860]: I0320 11:21:51.616000 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5zsx\" (UniqueName: \"kubernetes.io/projected/8fa1c68c-e71f-456c-a53f-1ba28dd3952f-kube-api-access-v5zsx\") pod \"8fa1c68c-e71f-456c-a53f-1ba28dd3952f\" (UID: \"8fa1c68c-e71f-456c-a53f-1ba28dd3952f\") " Mar 20 11:21:51 crc kubenswrapper[4860]: I0320 11:21:51.616069 4860 scope.go:117] "RemoveContainer" containerID="bd1b68278e2421bec44699bb69debaba80a232532bb0174492da79e52abc9baf" Mar 20 11:21:51 crc kubenswrapper[4860]: I0320 11:21:51.617560 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fa1c68c-e71f-456c-a53f-1ba28dd3952f-utilities" (OuterVolumeSpecName: "utilities") pod "8fa1c68c-e71f-456c-a53f-1ba28dd3952f" (UID: "8fa1c68c-e71f-456c-a53f-1ba28dd3952f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:21:51 crc kubenswrapper[4860]: I0320 11:21:51.633731 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fa1c68c-e71f-456c-a53f-1ba28dd3952f-kube-api-access-v5zsx" (OuterVolumeSpecName: "kube-api-access-v5zsx") pod "8fa1c68c-e71f-456c-a53f-1ba28dd3952f" (UID: "8fa1c68c-e71f-456c-a53f-1ba28dd3952f"). InnerVolumeSpecName "kube-api-access-v5zsx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:21:51 crc kubenswrapper[4860]: I0320 11:21:51.688421 4860 scope.go:117] "RemoveContainer" containerID="a0b6912b94144e45036f3769d17d2bd8ade70f17a27dfd648678da71cf95cfbd" Mar 20 11:21:51 crc kubenswrapper[4860]: E0320 11:21:51.706880 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0b6912b94144e45036f3769d17d2bd8ade70f17a27dfd648678da71cf95cfbd\": container with ID starting with a0b6912b94144e45036f3769d17d2bd8ade70f17a27dfd648678da71cf95cfbd not found: ID does not exist" containerID="a0b6912b94144e45036f3769d17d2bd8ade70f17a27dfd648678da71cf95cfbd" Mar 20 11:21:51 crc kubenswrapper[4860]: I0320 11:21:51.706946 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0b6912b94144e45036f3769d17d2bd8ade70f17a27dfd648678da71cf95cfbd"} err="failed to get container status \"a0b6912b94144e45036f3769d17d2bd8ade70f17a27dfd648678da71cf95cfbd\": rpc error: code = NotFound desc = could not find container \"a0b6912b94144e45036f3769d17d2bd8ade70f17a27dfd648678da71cf95cfbd\": container with ID starting with a0b6912b94144e45036f3769d17d2bd8ade70f17a27dfd648678da71cf95cfbd not found: ID does not exist" Mar 20 11:21:51 crc kubenswrapper[4860]: I0320 11:21:51.707071 4860 scope.go:117] "RemoveContainer" containerID="a15ba701649b81b12d2b014a552468aa2bfff91a5766797a31426e0bf11ff89f" Mar 20 11:21:51 crc kubenswrapper[4860]: E0320 11:21:51.707558 
4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a15ba701649b81b12d2b014a552468aa2bfff91a5766797a31426e0bf11ff89f\": container with ID starting with a15ba701649b81b12d2b014a552468aa2bfff91a5766797a31426e0bf11ff89f not found: ID does not exist" containerID="a15ba701649b81b12d2b014a552468aa2bfff91a5766797a31426e0bf11ff89f" Mar 20 11:21:51 crc kubenswrapper[4860]: I0320 11:21:51.707592 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a15ba701649b81b12d2b014a552468aa2bfff91a5766797a31426e0bf11ff89f"} err="failed to get container status \"a15ba701649b81b12d2b014a552468aa2bfff91a5766797a31426e0bf11ff89f\": rpc error: code = NotFound desc = could not find container \"a15ba701649b81b12d2b014a552468aa2bfff91a5766797a31426e0bf11ff89f\": container with ID starting with a15ba701649b81b12d2b014a552468aa2bfff91a5766797a31426e0bf11ff89f not found: ID does not exist" Mar 20 11:21:51 crc kubenswrapper[4860]: I0320 11:21:51.707609 4860 scope.go:117] "RemoveContainer" containerID="bd1b68278e2421bec44699bb69debaba80a232532bb0174492da79e52abc9baf" Mar 20 11:21:51 crc kubenswrapper[4860]: E0320 11:21:51.708156 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd1b68278e2421bec44699bb69debaba80a232532bb0174492da79e52abc9baf\": container with ID starting with bd1b68278e2421bec44699bb69debaba80a232532bb0174492da79e52abc9baf not found: ID does not exist" containerID="bd1b68278e2421bec44699bb69debaba80a232532bb0174492da79e52abc9baf" Mar 20 11:21:51 crc kubenswrapper[4860]: I0320 11:21:51.708200 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd1b68278e2421bec44699bb69debaba80a232532bb0174492da79e52abc9baf"} err="failed to get container status \"bd1b68278e2421bec44699bb69debaba80a232532bb0174492da79e52abc9baf\": rpc error: code = 
NotFound desc = could not find container \"bd1b68278e2421bec44699bb69debaba80a232532bb0174492da79e52abc9baf\": container with ID starting with bd1b68278e2421bec44699bb69debaba80a232532bb0174492da79e52abc9baf not found: ID does not exist"
Mar 20 11:21:51 crc kubenswrapper[4860]: I0320 11:21:51.717582 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5zsx\" (UniqueName: \"kubernetes.io/projected/8fa1c68c-e71f-456c-a53f-1ba28dd3952f-kube-api-access-v5zsx\") on node \"crc\" DevicePath \"\""
Mar 20 11:21:51 crc kubenswrapper[4860]: I0320 11:21:51.717621 4860 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fa1c68c-e71f-456c-a53f-1ba28dd3952f-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 11:21:51 crc kubenswrapper[4860]: I0320 11:21:51.764608 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fa1c68c-e71f-456c-a53f-1ba28dd3952f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8fa1c68c-e71f-456c-a53f-1ba28dd3952f" (UID: "8fa1c68c-e71f-456c-a53f-1ba28dd3952f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 11:21:51 crc kubenswrapper[4860]: I0320 11:21:51.819441 4860 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fa1c68c-e71f-456c-a53f-1ba28dd3952f-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 11:21:51 crc kubenswrapper[4860]: I0320 11:21:51.898987 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9gvjr"]
Mar 20 11:21:51 crc kubenswrapper[4860]: I0320 11:21:51.913392 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9gvjr"]
Mar 20 11:21:53 crc kubenswrapper[4860]: I0320 11:21:53.424284 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fa1c68c-e71f-456c-a53f-1ba28dd3952f" path="/var/lib/kubelet/pods/8fa1c68c-e71f-456c-a53f-1ba28dd3952f/volumes"
Mar 20 11:22:00 crc kubenswrapper[4860]: I0320 11:22:00.149561 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566762-4zgtb"]
Mar 20 11:22:00 crc kubenswrapper[4860]: E0320 11:22:00.150936 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fa1c68c-e71f-456c-a53f-1ba28dd3952f" containerName="registry-server"
Mar 20 11:22:00 crc kubenswrapper[4860]: I0320 11:22:00.150956 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fa1c68c-e71f-456c-a53f-1ba28dd3952f" containerName="registry-server"
Mar 20 11:22:00 crc kubenswrapper[4860]: E0320 11:22:00.150980 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fa1c68c-e71f-456c-a53f-1ba28dd3952f" containerName="extract-utilities"
Mar 20 11:22:00 crc kubenswrapper[4860]: I0320 11:22:00.150988 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fa1c68c-e71f-456c-a53f-1ba28dd3952f" containerName="extract-utilities"
Mar 20 11:22:00 crc kubenswrapper[4860]: E0320 11:22:00.151022 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fa1c68c-e71f-456c-a53f-1ba28dd3952f" containerName="extract-content"
Mar 20 11:22:00 crc kubenswrapper[4860]: I0320 11:22:00.151031 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fa1c68c-e71f-456c-a53f-1ba28dd3952f" containerName="extract-content"
Mar 20 11:22:00 crc kubenswrapper[4860]: I0320 11:22:00.151252 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fa1c68c-e71f-456c-a53f-1ba28dd3952f" containerName="registry-server"
Mar 20 11:22:00 crc kubenswrapper[4860]: I0320 11:22:00.151982 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566762-4zgtb"
Mar 20 11:22:00 crc kubenswrapper[4860]: I0320 11:22:00.155131 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 11:22:00 crc kubenswrapper[4860]: I0320 11:22:00.155707 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-82p2f"
Mar 20 11:22:00 crc kubenswrapper[4860]: I0320 11:22:00.156477 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 11:22:00 crc kubenswrapper[4860]: I0320 11:22:00.161681 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566762-4zgtb"]
Mar 20 11:22:00 crc kubenswrapper[4860]: I0320 11:22:00.262322 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5bnl\" (UniqueName: \"kubernetes.io/projected/92fe5f45-6751-47ad-ba62-ce45b44f7460-kube-api-access-w5bnl\") pod \"auto-csr-approver-29566762-4zgtb\" (UID: \"92fe5f45-6751-47ad-ba62-ce45b44f7460\") " pod="openshift-infra/auto-csr-approver-29566762-4zgtb"
Mar 20 11:22:00 crc kubenswrapper[4860]: I0320 11:22:00.363678 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5bnl\" (UniqueName: \"kubernetes.io/projected/92fe5f45-6751-47ad-ba62-ce45b44f7460-kube-api-access-w5bnl\") pod \"auto-csr-approver-29566762-4zgtb\" (UID: \"92fe5f45-6751-47ad-ba62-ce45b44f7460\") " pod="openshift-infra/auto-csr-approver-29566762-4zgtb"
Mar 20 11:22:00 crc kubenswrapper[4860]: I0320 11:22:00.390038 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5bnl\" (UniqueName: \"kubernetes.io/projected/92fe5f45-6751-47ad-ba62-ce45b44f7460-kube-api-access-w5bnl\") pod \"auto-csr-approver-29566762-4zgtb\" (UID: \"92fe5f45-6751-47ad-ba62-ce45b44f7460\") " pod="openshift-infra/auto-csr-approver-29566762-4zgtb"
Mar 20 11:22:00 crc kubenswrapper[4860]: I0320 11:22:00.413910 4860 scope.go:117] "RemoveContainer" containerID="07ab4f7bad4a73673176b5c6eb66987e832ab6b48e2270c87816fed93c5b31d4"
Mar 20 11:22:00 crc kubenswrapper[4860]: E0320 11:22:00.414189 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080"
Mar 20 11:22:00 crc kubenswrapper[4860]: I0320 11:22:00.490422 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566762-4zgtb"
Mar 20 11:22:00 crc kubenswrapper[4860]: I0320 11:22:00.993599 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566762-4zgtb"]
Mar 20 11:22:01 crc kubenswrapper[4860]: I0320 11:22:01.666251 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566762-4zgtb" event={"ID":"92fe5f45-6751-47ad-ba62-ce45b44f7460","Type":"ContainerStarted","Data":"b24a66f2d341276b0e735c7142247ede073361f3f02e8652e5b3749b9ba79ed7"}
Mar 20 11:22:03 crc kubenswrapper[4860]: I0320 11:22:03.683507 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566762-4zgtb" event={"ID":"92fe5f45-6751-47ad-ba62-ce45b44f7460","Type":"ContainerStarted","Data":"35b3d06c932db51a34b9c266a504a6d9855ee6f888b737047ccd7f4d521d88bd"}
Mar 20 11:22:04 crc kubenswrapper[4860]: I0320 11:22:04.695980 4860 generic.go:334] "Generic (PLEG): container finished" podID="92fe5f45-6751-47ad-ba62-ce45b44f7460" containerID="35b3d06c932db51a34b9c266a504a6d9855ee6f888b737047ccd7f4d521d88bd" exitCode=0
Mar 20 11:22:04 crc kubenswrapper[4860]: I0320 11:22:04.696101 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566762-4zgtb" event={"ID":"92fe5f45-6751-47ad-ba62-ce45b44f7460","Type":"ContainerDied","Data":"35b3d06c932db51a34b9c266a504a6d9855ee6f888b737047ccd7f4d521d88bd"}
Mar 20 11:22:05 crc kubenswrapper[4860]: I0320 11:22:05.036617 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566762-4zgtb"
Mar 20 11:22:05 crc kubenswrapper[4860]: I0320 11:22:05.139770 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5bnl\" (UniqueName: \"kubernetes.io/projected/92fe5f45-6751-47ad-ba62-ce45b44f7460-kube-api-access-w5bnl\") pod \"92fe5f45-6751-47ad-ba62-ce45b44f7460\" (UID: \"92fe5f45-6751-47ad-ba62-ce45b44f7460\") "
Mar 20 11:22:05 crc kubenswrapper[4860]: I0320 11:22:05.148634 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92fe5f45-6751-47ad-ba62-ce45b44f7460-kube-api-access-w5bnl" (OuterVolumeSpecName: "kube-api-access-w5bnl") pod "92fe5f45-6751-47ad-ba62-ce45b44f7460" (UID: "92fe5f45-6751-47ad-ba62-ce45b44f7460"). InnerVolumeSpecName "kube-api-access-w5bnl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 11:22:05 crc kubenswrapper[4860]: I0320 11:22:05.241792 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5bnl\" (UniqueName: \"kubernetes.io/projected/92fe5f45-6751-47ad-ba62-ce45b44f7460-kube-api-access-w5bnl\") on node \"crc\" DevicePath \"\""
Mar 20 11:22:05 crc kubenswrapper[4860]: I0320 11:22:05.708823 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566762-4zgtb" event={"ID":"92fe5f45-6751-47ad-ba62-ce45b44f7460","Type":"ContainerDied","Data":"b24a66f2d341276b0e735c7142247ede073361f3f02e8652e5b3749b9ba79ed7"}
Mar 20 11:22:05 crc kubenswrapper[4860]: I0320 11:22:05.708869 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b24a66f2d341276b0e735c7142247ede073361f3f02e8652e5b3749b9ba79ed7"
Mar 20 11:22:05 crc kubenswrapper[4860]: I0320 11:22:05.708999 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566762-4zgtb"
Mar 20 11:22:06 crc kubenswrapper[4860]: I0320 11:22:06.115733 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566756-77l7w"]
Mar 20 11:22:06 crc kubenswrapper[4860]: I0320 11:22:06.122184 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566756-77l7w"]
Mar 20 11:22:07 crc kubenswrapper[4860]: I0320 11:22:07.421970 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="638de697-8881-4bb2-b204-2e87655dccbf" path="/var/lib/kubelet/pods/638de697-8881-4bb2-b204-2e87655dccbf/volumes"
Mar 20 11:22:12 crc kubenswrapper[4860]: I0320 11:22:12.413703 4860 scope.go:117] "RemoveContainer" containerID="07ab4f7bad4a73673176b5c6eb66987e832ab6b48e2270c87816fed93c5b31d4"
Mar 20 11:22:12 crc kubenswrapper[4860]: E0320 11:22:12.414898 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080"
Mar 20 11:22:23 crc kubenswrapper[4860]: I0320 11:22:23.110099 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-55zcw"]
Mar 20 11:22:23 crc kubenswrapper[4860]: E0320 11:22:23.111006 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92fe5f45-6751-47ad-ba62-ce45b44f7460" containerName="oc"
Mar 20 11:22:23 crc kubenswrapper[4860]: I0320 11:22:23.111022 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="92fe5f45-6751-47ad-ba62-ce45b44f7460" containerName="oc"
Mar 20 11:22:23 crc kubenswrapper[4860]: I0320 11:22:23.111206 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="92fe5f45-6751-47ad-ba62-ce45b44f7460" containerName="oc"
Mar 20 11:22:23 crc kubenswrapper[4860]: I0320 11:22:23.112590 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-55zcw"
Mar 20 11:22:23 crc kubenswrapper[4860]: I0320 11:22:23.126347 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-55zcw"]
Mar 20 11:22:23 crc kubenswrapper[4860]: I0320 11:22:23.253050 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a827151-75d9-472f-9cd8-bd45629d4c42-utilities\") pod \"certified-operators-55zcw\" (UID: \"4a827151-75d9-472f-9cd8-bd45629d4c42\") " pod="openshift-marketplace/certified-operators-55zcw"
Mar 20 11:22:23 crc kubenswrapper[4860]: I0320 11:22:23.253150 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gx6q5\" (UniqueName: \"kubernetes.io/projected/4a827151-75d9-472f-9cd8-bd45629d4c42-kube-api-access-gx6q5\") pod \"certified-operators-55zcw\" (UID: \"4a827151-75d9-472f-9cd8-bd45629d4c42\") " pod="openshift-marketplace/certified-operators-55zcw"
Mar 20 11:22:23 crc kubenswrapper[4860]: I0320 11:22:23.253438 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a827151-75d9-472f-9cd8-bd45629d4c42-catalog-content\") pod \"certified-operators-55zcw\" (UID: \"4a827151-75d9-472f-9cd8-bd45629d4c42\") " pod="openshift-marketplace/certified-operators-55zcw"
Mar 20 11:22:23 crc kubenswrapper[4860]: I0320 11:22:23.355032 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a827151-75d9-472f-9cd8-bd45629d4c42-utilities\") pod \"certified-operators-55zcw\" (UID: \"4a827151-75d9-472f-9cd8-bd45629d4c42\") " pod="openshift-marketplace/certified-operators-55zcw"
Mar 20 11:22:23 crc kubenswrapper[4860]: I0320 11:22:23.355111 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gx6q5\" (UniqueName: \"kubernetes.io/projected/4a827151-75d9-472f-9cd8-bd45629d4c42-kube-api-access-gx6q5\") pod \"certified-operators-55zcw\" (UID: \"4a827151-75d9-472f-9cd8-bd45629d4c42\") " pod="openshift-marketplace/certified-operators-55zcw"
Mar 20 11:22:23 crc kubenswrapper[4860]: I0320 11:22:23.355154 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a827151-75d9-472f-9cd8-bd45629d4c42-catalog-content\") pod \"certified-operators-55zcw\" (UID: \"4a827151-75d9-472f-9cd8-bd45629d4c42\") " pod="openshift-marketplace/certified-operators-55zcw"
Mar 20 11:22:23 crc kubenswrapper[4860]: I0320 11:22:23.355675 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a827151-75d9-472f-9cd8-bd45629d4c42-catalog-content\") pod \"certified-operators-55zcw\" (UID: \"4a827151-75d9-472f-9cd8-bd45629d4c42\") " pod="openshift-marketplace/certified-operators-55zcw"
Mar 20 11:22:23 crc kubenswrapper[4860]: I0320 11:22:23.356239 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a827151-75d9-472f-9cd8-bd45629d4c42-utilities\") pod \"certified-operators-55zcw\" (UID: \"4a827151-75d9-472f-9cd8-bd45629d4c42\") " pod="openshift-marketplace/certified-operators-55zcw"
Mar 20 11:22:23 crc kubenswrapper[4860]: I0320 11:22:23.379298 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gx6q5\" (UniqueName: \"kubernetes.io/projected/4a827151-75d9-472f-9cd8-bd45629d4c42-kube-api-access-gx6q5\") pod \"certified-operators-55zcw\" (UID: \"4a827151-75d9-472f-9cd8-bd45629d4c42\") " pod="openshift-marketplace/certified-operators-55zcw"
Mar 20 11:22:23 crc kubenswrapper[4860]: I0320 11:22:23.413797 4860 scope.go:117] "RemoveContainer" containerID="07ab4f7bad4a73673176b5c6eb66987e832ab6b48e2270c87816fed93c5b31d4"
Mar 20 11:22:23 crc kubenswrapper[4860]: E0320 11:22:23.414164 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080"
Mar 20 11:22:23 crc kubenswrapper[4860]: I0320 11:22:23.441250 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-55zcw"
Mar 20 11:22:23 crc kubenswrapper[4860]: I0320 11:22:23.959916 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-55zcw"]
Mar 20 11:22:24 crc kubenswrapper[4860]: I0320 11:22:24.864701 4860 generic.go:334] "Generic (PLEG): container finished" podID="4a827151-75d9-472f-9cd8-bd45629d4c42" containerID="7ca363352ba6feb8cf71be3a28e5c8eb8525d4e02376901124ce7391b7b5f726" exitCode=0
Mar 20 11:22:24 crc kubenswrapper[4860]: I0320 11:22:24.865255 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-55zcw" event={"ID":"4a827151-75d9-472f-9cd8-bd45629d4c42","Type":"ContainerDied","Data":"7ca363352ba6feb8cf71be3a28e5c8eb8525d4e02376901124ce7391b7b5f726"}
Mar 20 11:22:24 crc kubenswrapper[4860]: I0320 11:22:24.865292 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-55zcw" event={"ID":"4a827151-75d9-472f-9cd8-bd45629d4c42","Type":"ContainerStarted","Data":"b5fa759c5bdc4dc59a530ac28ff28e464c1717a4acb2944a6d76e0c54f12fb36"}
Mar 20 11:22:25 crc kubenswrapper[4860]: I0320 11:22:25.878149 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-55zcw" event={"ID":"4a827151-75d9-472f-9cd8-bd45629d4c42","Type":"ContainerStarted","Data":"c43e402dc03d939aa3cc72188039cb78ca9d23e9b95540d5201d55b2128b616e"}
Mar 20 11:22:26 crc kubenswrapper[4860]: I0320 11:22:26.888938 4860 generic.go:334] "Generic (PLEG): container finished" podID="4a827151-75d9-472f-9cd8-bd45629d4c42" containerID="c43e402dc03d939aa3cc72188039cb78ca9d23e9b95540d5201d55b2128b616e" exitCode=0
Mar 20 11:22:26 crc kubenswrapper[4860]: I0320 11:22:26.889002 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-55zcw" event={"ID":"4a827151-75d9-472f-9cd8-bd45629d4c42","Type":"ContainerDied","Data":"c43e402dc03d939aa3cc72188039cb78ca9d23e9b95540d5201d55b2128b616e"}
Mar 20 11:22:28 crc kubenswrapper[4860]: I0320 11:22:28.912081 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-55zcw" event={"ID":"4a827151-75d9-472f-9cd8-bd45629d4c42","Type":"ContainerStarted","Data":"781b9d20ec35c7a0424aa30bd44f6e040f8d8d2cc92f8320ace80fd2b6217c2d"}
Mar 20 11:22:28 crc kubenswrapper[4860]: I0320 11:22:28.940022 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-55zcw" podStartSLOduration=2.715966631 podStartE2EDuration="5.939993554s" podCreationTimestamp="2026-03-20 11:22:23 +0000 UTC" firstStartedPulling="2026-03-20 11:22:24.86715643 +0000 UTC m=+1669.088517328" lastFinishedPulling="2026-03-20 11:22:28.091183353 +0000 UTC m=+1672.312544251" observedRunningTime="2026-03-20 11:22:28.939991354 +0000 UTC m=+1673.161352262" watchObservedRunningTime="2026-03-20 11:22:28.939993554 +0000 UTC m=+1673.161354452"
Mar 20 11:22:32 crc kubenswrapper[4860]: I0320 11:22:32.782512 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vlqnl"]
Mar 20 11:22:32 crc kubenswrapper[4860]: I0320 11:22:32.784895 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vlqnl"
Mar 20 11:22:32 crc kubenswrapper[4860]: I0320 11:22:32.798842 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vlqnl"]
Mar 20 11:22:32 crc kubenswrapper[4860]: I0320 11:22:32.905374 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8225bb93-169a-41cc-bdec-d466c6aa140f-catalog-content\") pod \"redhat-marketplace-vlqnl\" (UID: \"8225bb93-169a-41cc-bdec-d466c6aa140f\") " pod="openshift-marketplace/redhat-marketplace-vlqnl"
Mar 20 11:22:32 crc kubenswrapper[4860]: I0320 11:22:32.905445 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8225bb93-169a-41cc-bdec-d466c6aa140f-utilities\") pod \"redhat-marketplace-vlqnl\" (UID: \"8225bb93-169a-41cc-bdec-d466c6aa140f\") " pod="openshift-marketplace/redhat-marketplace-vlqnl"
Mar 20 11:22:32 crc kubenswrapper[4860]: I0320 11:22:32.905509 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bz6v\" (UniqueName: \"kubernetes.io/projected/8225bb93-169a-41cc-bdec-d466c6aa140f-kube-api-access-9bz6v\") pod \"redhat-marketplace-vlqnl\" (UID: \"8225bb93-169a-41cc-bdec-d466c6aa140f\") " pod="openshift-marketplace/redhat-marketplace-vlqnl"
Mar 20 11:22:33 crc kubenswrapper[4860]: I0320 11:22:33.007667 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bz6v\" (UniqueName: \"kubernetes.io/projected/8225bb93-169a-41cc-bdec-d466c6aa140f-kube-api-access-9bz6v\") pod \"redhat-marketplace-vlqnl\" (UID: \"8225bb93-169a-41cc-bdec-d466c6aa140f\") " pod="openshift-marketplace/redhat-marketplace-vlqnl"
Mar 20 11:22:33 crc kubenswrapper[4860]: I0320 11:22:33.007805 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8225bb93-169a-41cc-bdec-d466c6aa140f-catalog-content\") pod \"redhat-marketplace-vlqnl\" (UID: \"8225bb93-169a-41cc-bdec-d466c6aa140f\") " pod="openshift-marketplace/redhat-marketplace-vlqnl"
Mar 20 11:22:33 crc kubenswrapper[4860]: I0320 11:22:33.007841 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8225bb93-169a-41cc-bdec-d466c6aa140f-utilities\") pod \"redhat-marketplace-vlqnl\" (UID: \"8225bb93-169a-41cc-bdec-d466c6aa140f\") " pod="openshift-marketplace/redhat-marketplace-vlqnl"
Mar 20 11:22:33 crc kubenswrapper[4860]: I0320 11:22:33.008657 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8225bb93-169a-41cc-bdec-d466c6aa140f-catalog-content\") pod \"redhat-marketplace-vlqnl\" (UID: \"8225bb93-169a-41cc-bdec-d466c6aa140f\") " pod="openshift-marketplace/redhat-marketplace-vlqnl"
Mar 20 11:22:33 crc kubenswrapper[4860]: I0320 11:22:33.008756 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8225bb93-169a-41cc-bdec-d466c6aa140f-utilities\") pod \"redhat-marketplace-vlqnl\" (UID: \"8225bb93-169a-41cc-bdec-d466c6aa140f\") " pod="openshift-marketplace/redhat-marketplace-vlqnl"
Mar 20 11:22:33 crc kubenswrapper[4860]: I0320 11:22:33.031519 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bz6v\" (UniqueName: \"kubernetes.io/projected/8225bb93-169a-41cc-bdec-d466c6aa140f-kube-api-access-9bz6v\") pod \"redhat-marketplace-vlqnl\" (UID: \"8225bb93-169a-41cc-bdec-d466c6aa140f\") " pod="openshift-marketplace/redhat-marketplace-vlqnl"
Mar 20 11:22:33 crc kubenswrapper[4860]: I0320 11:22:33.105801 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vlqnl"
Mar 20 11:22:33 crc kubenswrapper[4860]: I0320 11:22:33.442103 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-55zcw"
Mar 20 11:22:33 crc kubenswrapper[4860]: I0320 11:22:33.442643 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-55zcw"
Mar 20 11:22:33 crc kubenswrapper[4860]: I0320 11:22:33.491993 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-55zcw"
Mar 20 11:22:33 crc kubenswrapper[4860]: I0320 11:22:33.590466 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vlqnl"]
Mar 20 11:22:33 crc kubenswrapper[4860]: I0320 11:22:33.957482 4860 generic.go:334] "Generic (PLEG): container finished" podID="8225bb93-169a-41cc-bdec-d466c6aa140f" containerID="0890a49332d1578d7bf6d9f8b129c1bd02d5ea141b8a561aed20c925d7d250f1" exitCode=0
Mar 20 11:22:33 crc kubenswrapper[4860]: I0320 11:22:33.957977 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vlqnl" event={"ID":"8225bb93-169a-41cc-bdec-d466c6aa140f","Type":"ContainerDied","Data":"0890a49332d1578d7bf6d9f8b129c1bd02d5ea141b8a561aed20c925d7d250f1"}
Mar 20 11:22:33 crc kubenswrapper[4860]: I0320 11:22:33.958407 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vlqnl" event={"ID":"8225bb93-169a-41cc-bdec-d466c6aa140f","Type":"ContainerStarted","Data":"ed55ee2e36d27fdb0b0342cb9867725ba680f9b8414c697e3fd56d92789efcf2"}
Mar 20 11:22:34 crc kubenswrapper[4860]: I0320 11:22:34.004573 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-55zcw"
Mar 20 11:22:35 crc kubenswrapper[4860]: I0320 11:22:35.741563 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-55zcw"]
Mar 20 11:22:35 crc kubenswrapper[4860]: I0320 11:22:35.982505 4860 generic.go:334] "Generic (PLEG): container finished" podID="8225bb93-169a-41cc-bdec-d466c6aa140f" containerID="5239da29342fce95cf3a754b1f1cafdd6623c84bb7b64c705d6fd9efc09e4418" exitCode=0
Mar 20 11:22:35 crc kubenswrapper[4860]: I0320 11:22:35.982555 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vlqnl" event={"ID":"8225bb93-169a-41cc-bdec-d466c6aa140f","Type":"ContainerDied","Data":"5239da29342fce95cf3a754b1f1cafdd6623c84bb7b64c705d6fd9efc09e4418"}
Mar 20 11:22:35 crc kubenswrapper[4860]: I0320 11:22:35.982814 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-55zcw" podUID="4a827151-75d9-472f-9cd8-bd45629d4c42" containerName="registry-server" containerID="cri-o://781b9d20ec35c7a0424aa30bd44f6e040f8d8d2cc92f8320ace80fd2b6217c2d" gracePeriod=2
Mar 20 11:22:36 crc kubenswrapper[4860]: I0320 11:22:36.392070 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-55zcw"
Mar 20 11:22:36 crc kubenswrapper[4860]: I0320 11:22:36.471824 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gx6q5\" (UniqueName: \"kubernetes.io/projected/4a827151-75d9-472f-9cd8-bd45629d4c42-kube-api-access-gx6q5\") pod \"4a827151-75d9-472f-9cd8-bd45629d4c42\" (UID: \"4a827151-75d9-472f-9cd8-bd45629d4c42\") "
Mar 20 11:22:36 crc kubenswrapper[4860]: I0320 11:22:36.471893 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a827151-75d9-472f-9cd8-bd45629d4c42-catalog-content\") pod \"4a827151-75d9-472f-9cd8-bd45629d4c42\" (UID: \"4a827151-75d9-472f-9cd8-bd45629d4c42\") "
Mar 20 11:22:36 crc kubenswrapper[4860]: I0320 11:22:36.471945 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a827151-75d9-472f-9cd8-bd45629d4c42-utilities\") pod \"4a827151-75d9-472f-9cd8-bd45629d4c42\" (UID: \"4a827151-75d9-472f-9cd8-bd45629d4c42\") "
Mar 20 11:22:36 crc kubenswrapper[4860]: I0320 11:22:36.473929 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a827151-75d9-472f-9cd8-bd45629d4c42-utilities" (OuterVolumeSpecName: "utilities") pod "4a827151-75d9-472f-9cd8-bd45629d4c42" (UID: "4a827151-75d9-472f-9cd8-bd45629d4c42"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 11:22:36 crc kubenswrapper[4860]: I0320 11:22:36.481195 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a827151-75d9-472f-9cd8-bd45629d4c42-kube-api-access-gx6q5" (OuterVolumeSpecName: "kube-api-access-gx6q5") pod "4a827151-75d9-472f-9cd8-bd45629d4c42" (UID: "4a827151-75d9-472f-9cd8-bd45629d4c42"). InnerVolumeSpecName "kube-api-access-gx6q5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 11:22:36 crc kubenswrapper[4860]: I0320 11:22:36.574517 4860 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a827151-75d9-472f-9cd8-bd45629d4c42-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 11:22:36 crc kubenswrapper[4860]: I0320 11:22:36.574562 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gx6q5\" (UniqueName: \"kubernetes.io/projected/4a827151-75d9-472f-9cd8-bd45629d4c42-kube-api-access-gx6q5\") on node \"crc\" DevicePath \"\""
Mar 20 11:22:36 crc kubenswrapper[4860]: I0320 11:22:36.994040 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vlqnl" event={"ID":"8225bb93-169a-41cc-bdec-d466c6aa140f","Type":"ContainerStarted","Data":"5c233e03126b7d5f84d03611f41f80cb77491947a9c2bc764bf19465608658bc"}
Mar 20 11:22:37 crc kubenswrapper[4860]: I0320 11:22:37.000167 4860 generic.go:334] "Generic (PLEG): container finished" podID="4a827151-75d9-472f-9cd8-bd45629d4c42" containerID="781b9d20ec35c7a0424aa30bd44f6e040f8d8d2cc92f8320ace80fd2b6217c2d" exitCode=0
Mar 20 11:22:37 crc kubenswrapper[4860]: I0320 11:22:37.000263 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-55zcw" event={"ID":"4a827151-75d9-472f-9cd8-bd45629d4c42","Type":"ContainerDied","Data":"781b9d20ec35c7a0424aa30bd44f6e040f8d8d2cc92f8320ace80fd2b6217c2d"}
Mar 20 11:22:37 crc kubenswrapper[4860]: I0320 11:22:37.000314 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-55zcw" event={"ID":"4a827151-75d9-472f-9cd8-bd45629d4c42","Type":"ContainerDied","Data":"b5fa759c5bdc4dc59a530ac28ff28e464c1717a4acb2944a6d76e0c54f12fb36"}
Mar 20 11:22:37 crc kubenswrapper[4860]: I0320 11:22:37.000314 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-55zcw"
Mar 20 11:22:37 crc kubenswrapper[4860]: I0320 11:22:37.000341 4860 scope.go:117] "RemoveContainer" containerID="781b9d20ec35c7a0424aa30bd44f6e040f8d8d2cc92f8320ace80fd2b6217c2d"
Mar 20 11:22:37 crc kubenswrapper[4860]: I0320 11:22:37.028927 4860 scope.go:117] "RemoveContainer" containerID="c43e402dc03d939aa3cc72188039cb78ca9d23e9b95540d5201d55b2128b616e"
Mar 20 11:22:37 crc kubenswrapper[4860]: I0320 11:22:37.032864 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vlqnl" podStartSLOduration=2.492740575 podStartE2EDuration="5.032847578s" podCreationTimestamp="2026-03-20 11:22:32 +0000 UTC" firstStartedPulling="2026-03-20 11:22:33.959636096 +0000 UTC m=+1678.180997024" lastFinishedPulling="2026-03-20 11:22:36.499743109 +0000 UTC m=+1680.721104027" observedRunningTime="2026-03-20 11:22:37.023992677 +0000 UTC m=+1681.245353585" watchObservedRunningTime="2026-03-20 11:22:37.032847578 +0000 UTC m=+1681.254208476"
Mar 20 11:22:37 crc kubenswrapper[4860]: I0320 11:22:37.052455 4860 scope.go:117] "RemoveContainer" containerID="7ca363352ba6feb8cf71be3a28e5c8eb8525d4e02376901124ce7391b7b5f726"
Mar 20 11:22:37 crc kubenswrapper[4860]: I0320 11:22:37.071544 4860 scope.go:117] "RemoveContainer" containerID="781b9d20ec35c7a0424aa30bd44f6e040f8d8d2cc92f8320ace80fd2b6217c2d"
Mar 20 11:22:37 crc kubenswrapper[4860]: E0320 11:22:37.072205 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"781b9d20ec35c7a0424aa30bd44f6e040f8d8d2cc92f8320ace80fd2b6217c2d\": container with ID starting with 781b9d20ec35c7a0424aa30bd44f6e040f8d8d2cc92f8320ace80fd2b6217c2d not found: ID does not exist" containerID="781b9d20ec35c7a0424aa30bd44f6e040f8d8d2cc92f8320ace80fd2b6217c2d"
Mar 20 11:22:37 crc kubenswrapper[4860]: I0320 11:22:37.072275 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"781b9d20ec35c7a0424aa30bd44f6e040f8d8d2cc92f8320ace80fd2b6217c2d"} err="failed to get container status \"781b9d20ec35c7a0424aa30bd44f6e040f8d8d2cc92f8320ace80fd2b6217c2d\": rpc error: code = NotFound desc = could not find container \"781b9d20ec35c7a0424aa30bd44f6e040f8d8d2cc92f8320ace80fd2b6217c2d\": container with ID starting with 781b9d20ec35c7a0424aa30bd44f6e040f8d8d2cc92f8320ace80fd2b6217c2d not found: ID does not exist"
Mar 20 11:22:37 crc kubenswrapper[4860]: I0320 11:22:37.072316 4860 scope.go:117] "RemoveContainer" containerID="c43e402dc03d939aa3cc72188039cb78ca9d23e9b95540d5201d55b2128b616e"
Mar 20 11:22:37 crc kubenswrapper[4860]: E0320 11:22:37.072984 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c43e402dc03d939aa3cc72188039cb78ca9d23e9b95540d5201d55b2128b616e\": container with ID starting with c43e402dc03d939aa3cc72188039cb78ca9d23e9b95540d5201d55b2128b616e not found: ID does not exist" containerID="c43e402dc03d939aa3cc72188039cb78ca9d23e9b95540d5201d55b2128b616e"
Mar 20 11:22:37 crc kubenswrapper[4860]: I0320 11:22:37.073057 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c43e402dc03d939aa3cc72188039cb78ca9d23e9b95540d5201d55b2128b616e"} err="failed to get container status \"c43e402dc03d939aa3cc72188039cb78ca9d23e9b95540d5201d55b2128b616e\": rpc error: code = NotFound desc = could not find container \"c43e402dc03d939aa3cc72188039cb78ca9d23e9b95540d5201d55b2128b616e\": container with ID starting with c43e402dc03d939aa3cc72188039cb78ca9d23e9b95540d5201d55b2128b616e not found: ID does not exist"
Mar 20 11:22:37 crc kubenswrapper[4860]: I0320 11:22:37.073120 4860 scope.go:117] "RemoveContainer" containerID="7ca363352ba6feb8cf71be3a28e5c8eb8525d4e02376901124ce7391b7b5f726"
Mar 20 11:22:37 crc kubenswrapper[4860]: E0320 11:22:37.074755 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ca363352ba6feb8cf71be3a28e5c8eb8525d4e02376901124ce7391b7b5f726\": container with ID starting with 7ca363352ba6feb8cf71be3a28e5c8eb8525d4e02376901124ce7391b7b5f726 not found: ID does not exist" containerID="7ca363352ba6feb8cf71be3a28e5c8eb8525d4e02376901124ce7391b7b5f726"
Mar 20 11:22:37 crc kubenswrapper[4860]: I0320 11:22:37.074807 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ca363352ba6feb8cf71be3a28e5c8eb8525d4e02376901124ce7391b7b5f726"} err="failed to get container status \"7ca363352ba6feb8cf71be3a28e5c8eb8525d4e02376901124ce7391b7b5f726\": rpc error: code = NotFound desc = could not find container \"7ca363352ba6feb8cf71be3a28e5c8eb8525d4e02376901124ce7391b7b5f726\": container with ID starting with 7ca363352ba6feb8cf71be3a28e5c8eb8525d4e02376901124ce7391b7b5f726 not found: ID does not exist"
Mar 20 11:22:37 crc kubenswrapper[4860]: I0320 11:22:37.349335 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a827151-75d9-472f-9cd8-bd45629d4c42-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4a827151-75d9-472f-9cd8-bd45629d4c42" (UID: "4a827151-75d9-472f-9cd8-bd45629d4c42"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 11:22:37 crc kubenswrapper[4860]: I0320 11:22:37.387924 4860 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a827151-75d9-472f-9cd8-bd45629d4c42-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 11:22:37 crc kubenswrapper[4860]: I0320 11:22:37.418550 4860 scope.go:117] "RemoveContainer" containerID="07ab4f7bad4a73673176b5c6eb66987e832ab6b48e2270c87816fed93c5b31d4"
Mar 20 11:22:37 crc kubenswrapper[4860]: E0320 11:22:37.418871 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080"
Mar 20 11:22:37 crc kubenswrapper[4860]: I0320 11:22:37.624172 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-55zcw"]
Mar 20 11:22:37 crc kubenswrapper[4860]: I0320 11:22:37.631465 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-55zcw"]
Mar 20 11:22:39 crc kubenswrapper[4860]: I0320 11:22:39.447533 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a827151-75d9-472f-9cd8-bd45629d4c42" path="/var/lib/kubelet/pods/4a827151-75d9-472f-9cd8-bd45629d4c42/volumes"
Mar 20 11:22:43 crc kubenswrapper[4860]: I0320 11:22:43.106545 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vlqnl"
Mar 20 11:22:43 crc kubenswrapper[4860]: I0320 11:22:43.107029 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vlqnl"
Mar 20 11:22:43 crc
kubenswrapper[4860]: I0320 11:22:43.160166 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vlqnl" Mar 20 11:22:44 crc kubenswrapper[4860]: I0320 11:22:44.128182 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vlqnl" Mar 20 11:22:44 crc kubenswrapper[4860]: I0320 11:22:44.192447 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vlqnl"] Mar 20 11:22:46 crc kubenswrapper[4860]: I0320 11:22:46.085554 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vlqnl" podUID="8225bb93-169a-41cc-bdec-d466c6aa140f" containerName="registry-server" containerID="cri-o://5c233e03126b7d5f84d03611f41f80cb77491947a9c2bc764bf19465608658bc" gracePeriod=2 Mar 20 11:22:47 crc kubenswrapper[4860]: I0320 11:22:47.081210 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vlqnl" Mar 20 11:22:47 crc kubenswrapper[4860]: I0320 11:22:47.101509 4860 generic.go:334] "Generic (PLEG): container finished" podID="8225bb93-169a-41cc-bdec-d466c6aa140f" containerID="5c233e03126b7d5f84d03611f41f80cb77491947a9c2bc764bf19465608658bc" exitCode=0 Mar 20 11:22:47 crc kubenswrapper[4860]: I0320 11:22:47.101554 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vlqnl" event={"ID":"8225bb93-169a-41cc-bdec-d466c6aa140f","Type":"ContainerDied","Data":"5c233e03126b7d5f84d03611f41f80cb77491947a9c2bc764bf19465608658bc"} Mar 20 11:22:47 crc kubenswrapper[4860]: I0320 11:22:47.101584 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vlqnl" event={"ID":"8225bb93-169a-41cc-bdec-d466c6aa140f","Type":"ContainerDied","Data":"ed55ee2e36d27fdb0b0342cb9867725ba680f9b8414c697e3fd56d92789efcf2"} Mar 20 11:22:47 crc kubenswrapper[4860]: I0320 11:22:47.101604 4860 scope.go:117] "RemoveContainer" containerID="5c233e03126b7d5f84d03611f41f80cb77491947a9c2bc764bf19465608658bc" Mar 20 11:22:47 crc kubenswrapper[4860]: I0320 11:22:47.101738 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vlqnl" Mar 20 11:22:47 crc kubenswrapper[4860]: I0320 11:22:47.133089 4860 scope.go:117] "RemoveContainer" containerID="5239da29342fce95cf3a754b1f1cafdd6623c84bb7b64c705d6fd9efc09e4418" Mar 20 11:22:47 crc kubenswrapper[4860]: I0320 11:22:47.153183 4860 scope.go:117] "RemoveContainer" containerID="0890a49332d1578d7bf6d9f8b129c1bd02d5ea141b8a561aed20c925d7d250f1" Mar 20 11:22:47 crc kubenswrapper[4860]: I0320 11:22:47.156205 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9bz6v\" (UniqueName: \"kubernetes.io/projected/8225bb93-169a-41cc-bdec-d466c6aa140f-kube-api-access-9bz6v\") pod \"8225bb93-169a-41cc-bdec-d466c6aa140f\" (UID: \"8225bb93-169a-41cc-bdec-d466c6aa140f\") " Mar 20 11:22:47 crc kubenswrapper[4860]: I0320 11:22:47.156290 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8225bb93-169a-41cc-bdec-d466c6aa140f-utilities\") pod \"8225bb93-169a-41cc-bdec-d466c6aa140f\" (UID: \"8225bb93-169a-41cc-bdec-d466c6aa140f\") " Mar 20 11:22:47 crc kubenswrapper[4860]: I0320 11:22:47.156427 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8225bb93-169a-41cc-bdec-d466c6aa140f-catalog-content\") pod \"8225bb93-169a-41cc-bdec-d466c6aa140f\" (UID: \"8225bb93-169a-41cc-bdec-d466c6aa140f\") " Mar 20 11:22:47 crc kubenswrapper[4860]: I0320 11:22:47.157504 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8225bb93-169a-41cc-bdec-d466c6aa140f-utilities" (OuterVolumeSpecName: "utilities") pod "8225bb93-169a-41cc-bdec-d466c6aa140f" (UID: "8225bb93-169a-41cc-bdec-d466c6aa140f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:22:47 crc kubenswrapper[4860]: I0320 11:22:47.164172 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8225bb93-169a-41cc-bdec-d466c6aa140f-kube-api-access-9bz6v" (OuterVolumeSpecName: "kube-api-access-9bz6v") pod "8225bb93-169a-41cc-bdec-d466c6aa140f" (UID: "8225bb93-169a-41cc-bdec-d466c6aa140f"). InnerVolumeSpecName "kube-api-access-9bz6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:22:47 crc kubenswrapper[4860]: I0320 11:22:47.182026 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8225bb93-169a-41cc-bdec-d466c6aa140f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8225bb93-169a-41cc-bdec-d466c6aa140f" (UID: "8225bb93-169a-41cc-bdec-d466c6aa140f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:22:47 crc kubenswrapper[4860]: I0320 11:22:47.205138 4860 scope.go:117] "RemoveContainer" containerID="5c233e03126b7d5f84d03611f41f80cb77491947a9c2bc764bf19465608658bc" Mar 20 11:22:47 crc kubenswrapper[4860]: E0320 11:22:47.205594 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c233e03126b7d5f84d03611f41f80cb77491947a9c2bc764bf19465608658bc\": container with ID starting with 5c233e03126b7d5f84d03611f41f80cb77491947a9c2bc764bf19465608658bc not found: ID does not exist" containerID="5c233e03126b7d5f84d03611f41f80cb77491947a9c2bc764bf19465608658bc" Mar 20 11:22:47 crc kubenswrapper[4860]: I0320 11:22:47.205648 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c233e03126b7d5f84d03611f41f80cb77491947a9c2bc764bf19465608658bc"} err="failed to get container status \"5c233e03126b7d5f84d03611f41f80cb77491947a9c2bc764bf19465608658bc\": rpc error: code = NotFound desc = could not find 
container \"5c233e03126b7d5f84d03611f41f80cb77491947a9c2bc764bf19465608658bc\": container with ID starting with 5c233e03126b7d5f84d03611f41f80cb77491947a9c2bc764bf19465608658bc not found: ID does not exist" Mar 20 11:22:47 crc kubenswrapper[4860]: I0320 11:22:47.205678 4860 scope.go:117] "RemoveContainer" containerID="5239da29342fce95cf3a754b1f1cafdd6623c84bb7b64c705d6fd9efc09e4418" Mar 20 11:22:47 crc kubenswrapper[4860]: E0320 11:22:47.205902 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5239da29342fce95cf3a754b1f1cafdd6623c84bb7b64c705d6fd9efc09e4418\": container with ID starting with 5239da29342fce95cf3a754b1f1cafdd6623c84bb7b64c705d6fd9efc09e4418 not found: ID does not exist" containerID="5239da29342fce95cf3a754b1f1cafdd6623c84bb7b64c705d6fd9efc09e4418" Mar 20 11:22:47 crc kubenswrapper[4860]: I0320 11:22:47.205933 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5239da29342fce95cf3a754b1f1cafdd6623c84bb7b64c705d6fd9efc09e4418"} err="failed to get container status \"5239da29342fce95cf3a754b1f1cafdd6623c84bb7b64c705d6fd9efc09e4418\": rpc error: code = NotFound desc = could not find container \"5239da29342fce95cf3a754b1f1cafdd6623c84bb7b64c705d6fd9efc09e4418\": container with ID starting with 5239da29342fce95cf3a754b1f1cafdd6623c84bb7b64c705d6fd9efc09e4418 not found: ID does not exist" Mar 20 11:22:47 crc kubenswrapper[4860]: I0320 11:22:47.205959 4860 scope.go:117] "RemoveContainer" containerID="0890a49332d1578d7bf6d9f8b129c1bd02d5ea141b8a561aed20c925d7d250f1" Mar 20 11:22:47 crc kubenswrapper[4860]: E0320 11:22:47.206500 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0890a49332d1578d7bf6d9f8b129c1bd02d5ea141b8a561aed20c925d7d250f1\": container with ID starting with 0890a49332d1578d7bf6d9f8b129c1bd02d5ea141b8a561aed20c925d7d250f1 not found: ID does 
not exist" containerID="0890a49332d1578d7bf6d9f8b129c1bd02d5ea141b8a561aed20c925d7d250f1" Mar 20 11:22:47 crc kubenswrapper[4860]: I0320 11:22:47.206530 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0890a49332d1578d7bf6d9f8b129c1bd02d5ea141b8a561aed20c925d7d250f1"} err="failed to get container status \"0890a49332d1578d7bf6d9f8b129c1bd02d5ea141b8a561aed20c925d7d250f1\": rpc error: code = NotFound desc = could not find container \"0890a49332d1578d7bf6d9f8b129c1bd02d5ea141b8a561aed20c925d7d250f1\": container with ID starting with 0890a49332d1578d7bf6d9f8b129c1bd02d5ea141b8a561aed20c925d7d250f1 not found: ID does not exist" Mar 20 11:22:47 crc kubenswrapper[4860]: I0320 11:22:47.258266 4860 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8225bb93-169a-41cc-bdec-d466c6aa140f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:22:47 crc kubenswrapper[4860]: I0320 11:22:47.258311 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9bz6v\" (UniqueName: \"kubernetes.io/projected/8225bb93-169a-41cc-bdec-d466c6aa140f-kube-api-access-9bz6v\") on node \"crc\" DevicePath \"\"" Mar 20 11:22:47 crc kubenswrapper[4860]: I0320 11:22:47.258326 4860 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8225bb93-169a-41cc-bdec-d466c6aa140f-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:22:47 crc kubenswrapper[4860]: I0320 11:22:47.440385 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vlqnl"] Mar 20 11:22:47 crc kubenswrapper[4860]: I0320 11:22:47.450855 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vlqnl"] Mar 20 11:22:48 crc kubenswrapper[4860]: I0320 11:22:48.087826 4860 scope.go:117] "RemoveContainer" 
containerID="615c37395180628a3c76825ddb15312c7ceadec62513b183ca243bd28c96c9ed" Mar 20 11:22:48 crc kubenswrapper[4860]: I0320 11:22:48.413641 4860 scope.go:117] "RemoveContainer" containerID="07ab4f7bad4a73673176b5c6eb66987e832ab6b48e2270c87816fed93c5b31d4" Mar 20 11:22:48 crc kubenswrapper[4860]: E0320 11:22:48.413941 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:22:49 crc kubenswrapper[4860]: I0320 11:22:49.432979 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8225bb93-169a-41cc-bdec-d466c6aa140f" path="/var/lib/kubelet/pods/8225bb93-169a-41cc-bdec-d466c6aa140f/volumes" Mar 20 11:23:03 crc kubenswrapper[4860]: I0320 11:23:03.414907 4860 scope.go:117] "RemoveContainer" containerID="07ab4f7bad4a73673176b5c6eb66987e832ab6b48e2270c87816fed93c5b31d4" Mar 20 11:23:03 crc kubenswrapper[4860]: E0320 11:23:03.416737 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:23:14 crc kubenswrapper[4860]: I0320 11:23:14.413688 4860 scope.go:117] "RemoveContainer" containerID="07ab4f7bad4a73673176b5c6eb66987e832ab6b48e2270c87816fed93c5b31d4" Mar 20 11:23:14 crc kubenswrapper[4860]: E0320 11:23:14.414254 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:23:25 crc kubenswrapper[4860]: I0320 11:23:25.414021 4860 scope.go:117] "RemoveContainer" containerID="07ab4f7bad4a73673176b5c6eb66987e832ab6b48e2270c87816fed93c5b31d4" Mar 20 11:23:25 crc kubenswrapper[4860]: E0320 11:23:25.414760 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:23:38 crc kubenswrapper[4860]: I0320 11:23:38.413116 4860 scope.go:117] "RemoveContainer" containerID="07ab4f7bad4a73673176b5c6eb66987e832ab6b48e2270c87816fed93c5b31d4" Mar 20 11:23:38 crc kubenswrapper[4860]: E0320 11:23:38.414236 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:23:50 crc kubenswrapper[4860]: I0320 11:23:50.414075 4860 scope.go:117] "RemoveContainer" containerID="07ab4f7bad4a73673176b5c6eb66987e832ab6b48e2270c87816fed93c5b31d4" Mar 20 11:23:50 crc kubenswrapper[4860]: E0320 11:23:50.415159 4860 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:24:00 crc kubenswrapper[4860]: I0320 11:24:00.155817 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566764-lml7m"] Mar 20 11:24:00 crc kubenswrapper[4860]: E0320 11:24:00.157388 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8225bb93-169a-41cc-bdec-d466c6aa140f" containerName="registry-server" Mar 20 11:24:00 crc kubenswrapper[4860]: I0320 11:24:00.157410 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="8225bb93-169a-41cc-bdec-d466c6aa140f" containerName="registry-server" Mar 20 11:24:00 crc kubenswrapper[4860]: E0320 11:24:00.157439 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a827151-75d9-472f-9cd8-bd45629d4c42" containerName="extract-content" Mar 20 11:24:00 crc kubenswrapper[4860]: I0320 11:24:00.157449 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a827151-75d9-472f-9cd8-bd45629d4c42" containerName="extract-content" Mar 20 11:24:00 crc kubenswrapper[4860]: E0320 11:24:00.157462 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8225bb93-169a-41cc-bdec-d466c6aa140f" containerName="extract-utilities" Mar 20 11:24:00 crc kubenswrapper[4860]: I0320 11:24:00.157472 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="8225bb93-169a-41cc-bdec-d466c6aa140f" containerName="extract-utilities" Mar 20 11:24:00 crc kubenswrapper[4860]: E0320 11:24:00.157491 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a827151-75d9-472f-9cd8-bd45629d4c42" containerName="registry-server" Mar 20 11:24:00 crc kubenswrapper[4860]: 
I0320 11:24:00.157501 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a827151-75d9-472f-9cd8-bd45629d4c42" containerName="registry-server" Mar 20 11:24:00 crc kubenswrapper[4860]: E0320 11:24:00.157516 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a827151-75d9-472f-9cd8-bd45629d4c42" containerName="extract-utilities" Mar 20 11:24:00 crc kubenswrapper[4860]: I0320 11:24:00.157524 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a827151-75d9-472f-9cd8-bd45629d4c42" containerName="extract-utilities" Mar 20 11:24:00 crc kubenswrapper[4860]: E0320 11:24:00.157535 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8225bb93-169a-41cc-bdec-d466c6aa140f" containerName="extract-content" Mar 20 11:24:00 crc kubenswrapper[4860]: I0320 11:24:00.157547 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="8225bb93-169a-41cc-bdec-d466c6aa140f" containerName="extract-content" Mar 20 11:24:00 crc kubenswrapper[4860]: I0320 11:24:00.157779 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a827151-75d9-472f-9cd8-bd45629d4c42" containerName="registry-server" Mar 20 11:24:00 crc kubenswrapper[4860]: I0320 11:24:00.157794 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="8225bb93-169a-41cc-bdec-d466c6aa140f" containerName="registry-server" Mar 20 11:24:00 crc kubenswrapper[4860]: I0320 11:24:00.158520 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566764-lml7m" Mar 20 11:24:00 crc kubenswrapper[4860]: I0320 11:24:00.161359 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-82p2f" Mar 20 11:24:00 crc kubenswrapper[4860]: I0320 11:24:00.161632 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:24:00 crc kubenswrapper[4860]: I0320 11:24:00.165114 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566764-lml7m"] Mar 20 11:24:00 crc kubenswrapper[4860]: I0320 11:24:00.165987 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:24:00 crc kubenswrapper[4860]: I0320 11:24:00.286530 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2ntr\" (UniqueName: \"kubernetes.io/projected/6d332346-eeda-4316-9757-20948492ca2a-kube-api-access-p2ntr\") pod \"auto-csr-approver-29566764-lml7m\" (UID: \"6d332346-eeda-4316-9757-20948492ca2a\") " pod="openshift-infra/auto-csr-approver-29566764-lml7m" Mar 20 11:24:00 crc kubenswrapper[4860]: I0320 11:24:00.387698 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2ntr\" (UniqueName: \"kubernetes.io/projected/6d332346-eeda-4316-9757-20948492ca2a-kube-api-access-p2ntr\") pod \"auto-csr-approver-29566764-lml7m\" (UID: \"6d332346-eeda-4316-9757-20948492ca2a\") " pod="openshift-infra/auto-csr-approver-29566764-lml7m" Mar 20 11:24:00 crc kubenswrapper[4860]: I0320 11:24:00.408564 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2ntr\" (UniqueName: \"kubernetes.io/projected/6d332346-eeda-4316-9757-20948492ca2a-kube-api-access-p2ntr\") pod \"auto-csr-approver-29566764-lml7m\" (UID: \"6d332346-eeda-4316-9757-20948492ca2a\") " 
pod="openshift-infra/auto-csr-approver-29566764-lml7m" Mar 20 11:24:00 crc kubenswrapper[4860]: I0320 11:24:00.479836 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566764-lml7m" Mar 20 11:24:00 crc kubenswrapper[4860]: I0320 11:24:00.942140 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566764-lml7m"] Mar 20 11:24:01 crc kubenswrapper[4860]: I0320 11:24:01.731967 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566764-lml7m" event={"ID":"6d332346-eeda-4316-9757-20948492ca2a","Type":"ContainerStarted","Data":"fa166993a364e7b61fba777749a0d56e6dfca916c31d3e14b23fae61b008dc2c"} Mar 20 11:24:05 crc kubenswrapper[4860]: I0320 11:24:05.414181 4860 scope.go:117] "RemoveContainer" containerID="07ab4f7bad4a73673176b5c6eb66987e832ab6b48e2270c87816fed93c5b31d4" Mar 20 11:24:05 crc kubenswrapper[4860]: E0320 11:24:05.414942 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:24:07 crc kubenswrapper[4860]: I0320 11:24:07.779936 4860 generic.go:334] "Generic (PLEG): container finished" podID="6d332346-eeda-4316-9757-20948492ca2a" containerID="c19bdb0d3c5267be319cea9d2984f965c5e932c087050f789ce73157db7c4694" exitCode=0 Mar 20 11:24:07 crc kubenswrapper[4860]: I0320 11:24:07.780466 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566764-lml7m" event={"ID":"6d332346-eeda-4316-9757-20948492ca2a","Type":"ContainerDied","Data":"c19bdb0d3c5267be319cea9d2984f965c5e932c087050f789ce73157db7c4694"} 
Mar 20 11:24:09 crc kubenswrapper[4860]: I0320 11:24:09.058886 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566764-lml7m" Mar 20 11:24:09 crc kubenswrapper[4860]: I0320 11:24:09.139598 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2ntr\" (UniqueName: \"kubernetes.io/projected/6d332346-eeda-4316-9757-20948492ca2a-kube-api-access-p2ntr\") pod \"6d332346-eeda-4316-9757-20948492ca2a\" (UID: \"6d332346-eeda-4316-9757-20948492ca2a\") " Mar 20 11:24:09 crc kubenswrapper[4860]: I0320 11:24:09.148332 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d332346-eeda-4316-9757-20948492ca2a-kube-api-access-p2ntr" (OuterVolumeSpecName: "kube-api-access-p2ntr") pod "6d332346-eeda-4316-9757-20948492ca2a" (UID: "6d332346-eeda-4316-9757-20948492ca2a"). InnerVolumeSpecName "kube-api-access-p2ntr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:24:09 crc kubenswrapper[4860]: I0320 11:24:09.242163 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2ntr\" (UniqueName: \"kubernetes.io/projected/6d332346-eeda-4316-9757-20948492ca2a-kube-api-access-p2ntr\") on node \"crc\" DevicePath \"\"" Mar 20 11:24:09 crc kubenswrapper[4860]: I0320 11:24:09.797055 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566764-lml7m" event={"ID":"6d332346-eeda-4316-9757-20948492ca2a","Type":"ContainerDied","Data":"fa166993a364e7b61fba777749a0d56e6dfca916c31d3e14b23fae61b008dc2c"} Mar 20 11:24:09 crc kubenswrapper[4860]: I0320 11:24:09.797465 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa166993a364e7b61fba777749a0d56e6dfca916c31d3e14b23fae61b008dc2c" Mar 20 11:24:09 crc kubenswrapper[4860]: I0320 11:24:09.797084 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566764-lml7m" Mar 20 11:24:10 crc kubenswrapper[4860]: I0320 11:24:10.146015 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566758-bblpr"] Mar 20 11:24:10 crc kubenswrapper[4860]: I0320 11:24:10.152143 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566758-bblpr"] Mar 20 11:24:11 crc kubenswrapper[4860]: I0320 11:24:11.423219 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d4db42a-d915-4c4d-a985-be77a5381514" path="/var/lib/kubelet/pods/3d4db42a-d915-4c4d-a985-be77a5381514/volumes" Mar 20 11:24:17 crc kubenswrapper[4860]: I0320 11:24:17.418581 4860 scope.go:117] "RemoveContainer" containerID="07ab4f7bad4a73673176b5c6eb66987e832ab6b48e2270c87816fed93c5b31d4" Mar 20 11:24:17 crc kubenswrapper[4860]: E0320 11:24:17.419889 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:24:28 crc kubenswrapper[4860]: I0320 11:24:28.414156 4860 scope.go:117] "RemoveContainer" containerID="07ab4f7bad4a73673176b5c6eb66987e832ab6b48e2270c87816fed93c5b31d4" Mar 20 11:24:28 crc kubenswrapper[4860]: E0320 11:24:28.415033 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" 
podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:24:40 crc kubenswrapper[4860]: I0320 11:24:40.413745 4860 scope.go:117] "RemoveContainer" containerID="07ab4f7bad4a73673176b5c6eb66987e832ab6b48e2270c87816fed93c5b31d4" Mar 20 11:24:40 crc kubenswrapper[4860]: E0320 11:24:40.415172 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:24:48 crc kubenswrapper[4860]: I0320 11:24:48.212119 4860 scope.go:117] "RemoveContainer" containerID="666fc76c19255af020ada26a1d756d00c6fc27b0113301cab647ad4c35e9ef0c" Mar 20 11:24:53 crc kubenswrapper[4860]: I0320 11:24:53.414006 4860 scope.go:117] "RemoveContainer" containerID="07ab4f7bad4a73673176b5c6eb66987e832ab6b48e2270c87816fed93c5b31d4" Mar 20 11:24:53 crc kubenswrapper[4860]: E0320 11:24:53.414769 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:25:06 crc kubenswrapper[4860]: I0320 11:25:06.416092 4860 scope.go:117] "RemoveContainer" containerID="07ab4f7bad4a73673176b5c6eb66987e832ab6b48e2270c87816fed93c5b31d4" Mar 20 11:25:06 crc kubenswrapper[4860]: E0320 11:25:06.418751 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:25:21 crc kubenswrapper[4860]: I0320 11:25:21.413257 4860 scope.go:117] "RemoveContainer" containerID="07ab4f7bad4a73673176b5c6eb66987e832ab6b48e2270c87816fed93c5b31d4" Mar 20 11:25:21 crc kubenswrapper[4860]: E0320 11:25:21.414308 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:25:34 crc kubenswrapper[4860]: I0320 11:25:34.414001 4860 scope.go:117] "RemoveContainer" containerID="07ab4f7bad4a73673176b5c6eb66987e832ab6b48e2270c87816fed93c5b31d4" Mar 20 11:25:34 crc kubenswrapper[4860]: E0320 11:25:34.415159 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:25:45 crc kubenswrapper[4860]: I0320 11:25:45.413443 4860 scope.go:117] "RemoveContainer" containerID="07ab4f7bad4a73673176b5c6eb66987e832ab6b48e2270c87816fed93c5b31d4" Mar 20 11:25:45 crc kubenswrapper[4860]: E0320 11:25:45.414402 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:25:57 crc kubenswrapper[4860]: I0320 11:25:57.420128 4860 scope.go:117] "RemoveContainer" containerID="07ab4f7bad4a73673176b5c6eb66987e832ab6b48e2270c87816fed93c5b31d4" Mar 20 11:25:57 crc kubenswrapper[4860]: E0320 11:25:57.421209 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:26:00 crc kubenswrapper[4860]: I0320 11:26:00.152395 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566766-fhwnn"] Mar 20 11:26:00 crc kubenswrapper[4860]: E0320 11:26:00.153412 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d332346-eeda-4316-9757-20948492ca2a" containerName="oc" Mar 20 11:26:00 crc kubenswrapper[4860]: I0320 11:26:00.153438 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d332346-eeda-4316-9757-20948492ca2a" containerName="oc" Mar 20 11:26:00 crc kubenswrapper[4860]: I0320 11:26:00.153638 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d332346-eeda-4316-9757-20948492ca2a" containerName="oc" Mar 20 11:26:00 crc kubenswrapper[4860]: I0320 11:26:00.154437 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566766-fhwnn" Mar 20 11:26:00 crc kubenswrapper[4860]: I0320 11:26:00.157732 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:26:00 crc kubenswrapper[4860]: I0320 11:26:00.158086 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:26:00 crc kubenswrapper[4860]: I0320 11:26:00.162698 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-82p2f" Mar 20 11:26:00 crc kubenswrapper[4860]: I0320 11:26:00.164766 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prqr5\" (UniqueName: \"kubernetes.io/projected/8a62a673-bde2-4cf8-bae1-56252a15c71e-kube-api-access-prqr5\") pod \"auto-csr-approver-29566766-fhwnn\" (UID: \"8a62a673-bde2-4cf8-bae1-56252a15c71e\") " pod="openshift-infra/auto-csr-approver-29566766-fhwnn" Mar 20 11:26:00 crc kubenswrapper[4860]: I0320 11:26:00.166084 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566766-fhwnn"] Mar 20 11:26:00 crc kubenswrapper[4860]: I0320 11:26:00.266155 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prqr5\" (UniqueName: \"kubernetes.io/projected/8a62a673-bde2-4cf8-bae1-56252a15c71e-kube-api-access-prqr5\") pod \"auto-csr-approver-29566766-fhwnn\" (UID: \"8a62a673-bde2-4cf8-bae1-56252a15c71e\") " pod="openshift-infra/auto-csr-approver-29566766-fhwnn" Mar 20 11:26:00 crc kubenswrapper[4860]: I0320 11:26:00.293140 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prqr5\" (UniqueName: \"kubernetes.io/projected/8a62a673-bde2-4cf8-bae1-56252a15c71e-kube-api-access-prqr5\") pod \"auto-csr-approver-29566766-fhwnn\" (UID: \"8a62a673-bde2-4cf8-bae1-56252a15c71e\") " 
pod="openshift-infra/auto-csr-approver-29566766-fhwnn" Mar 20 11:26:00 crc kubenswrapper[4860]: I0320 11:26:00.483671 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566766-fhwnn" Mar 20 11:26:00 crc kubenswrapper[4860]: I0320 11:26:00.929342 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566766-fhwnn"] Mar 20 11:26:01 crc kubenswrapper[4860]: I0320 11:26:01.721108 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566766-fhwnn" event={"ID":"8a62a673-bde2-4cf8-bae1-56252a15c71e","Type":"ContainerStarted","Data":"db97e9b10e9b35035d0ce847e43abebd1ff265f8bdd042a898818f167ac6ca2f"} Mar 20 11:26:02 crc kubenswrapper[4860]: I0320 11:26:02.731502 4860 generic.go:334] "Generic (PLEG): container finished" podID="8a62a673-bde2-4cf8-bae1-56252a15c71e" containerID="c8ce87256d0115d4f80012e1ce15b95a524f1b59f29da3d0c9299a0f68d2780e" exitCode=0 Mar 20 11:26:02 crc kubenswrapper[4860]: I0320 11:26:02.731559 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566766-fhwnn" event={"ID":"8a62a673-bde2-4cf8-bae1-56252a15c71e","Type":"ContainerDied","Data":"c8ce87256d0115d4f80012e1ce15b95a524f1b59f29da3d0c9299a0f68d2780e"} Mar 20 11:26:04 crc kubenswrapper[4860]: I0320 11:26:04.078008 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566766-fhwnn" Mar 20 11:26:04 crc kubenswrapper[4860]: I0320 11:26:04.242657 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prqr5\" (UniqueName: \"kubernetes.io/projected/8a62a673-bde2-4cf8-bae1-56252a15c71e-kube-api-access-prqr5\") pod \"8a62a673-bde2-4cf8-bae1-56252a15c71e\" (UID: \"8a62a673-bde2-4cf8-bae1-56252a15c71e\") " Mar 20 11:26:04 crc kubenswrapper[4860]: I0320 11:26:04.251801 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a62a673-bde2-4cf8-bae1-56252a15c71e-kube-api-access-prqr5" (OuterVolumeSpecName: "kube-api-access-prqr5") pod "8a62a673-bde2-4cf8-bae1-56252a15c71e" (UID: "8a62a673-bde2-4cf8-bae1-56252a15c71e"). InnerVolumeSpecName "kube-api-access-prqr5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:26:04 crc kubenswrapper[4860]: I0320 11:26:04.344905 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-prqr5\" (UniqueName: \"kubernetes.io/projected/8a62a673-bde2-4cf8-bae1-56252a15c71e-kube-api-access-prqr5\") on node \"crc\" DevicePath \"\"" Mar 20 11:26:04 crc kubenswrapper[4860]: I0320 11:26:04.750076 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566766-fhwnn" event={"ID":"8a62a673-bde2-4cf8-bae1-56252a15c71e","Type":"ContainerDied","Data":"db97e9b10e9b35035d0ce847e43abebd1ff265f8bdd042a898818f167ac6ca2f"} Mar 20 11:26:04 crc kubenswrapper[4860]: I0320 11:26:04.750585 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db97e9b10e9b35035d0ce847e43abebd1ff265f8bdd042a898818f167ac6ca2f" Mar 20 11:26:04 crc kubenswrapper[4860]: I0320 11:26:04.750115 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566766-fhwnn" Mar 20 11:26:05 crc kubenswrapper[4860]: I0320 11:26:05.151599 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566760-gs8qf"] Mar 20 11:26:05 crc kubenswrapper[4860]: I0320 11:26:05.157310 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566760-gs8qf"] Mar 20 11:26:05 crc kubenswrapper[4860]: I0320 11:26:05.422482 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02d4a854-e21a-46b4-976b-17645af17c8b" path="/var/lib/kubelet/pods/02d4a854-e21a-46b4-976b-17645af17c8b/volumes" Mar 20 11:26:11 crc kubenswrapper[4860]: I0320 11:26:11.413923 4860 scope.go:117] "RemoveContainer" containerID="07ab4f7bad4a73673176b5c6eb66987e832ab6b48e2270c87816fed93c5b31d4" Mar 20 11:26:11 crc kubenswrapper[4860]: E0320 11:26:11.415050 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:26:25 crc kubenswrapper[4860]: I0320 11:26:25.413550 4860 scope.go:117] "RemoveContainer" containerID="07ab4f7bad4a73673176b5c6eb66987e832ab6b48e2270c87816fed93c5b31d4" Mar 20 11:26:25 crc kubenswrapper[4860]: I0320 11:26:25.924622 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" event={"ID":"6a9df230-75a1-4b64-8d00-c179e9c19080","Type":"ContainerStarted","Data":"b5a83b0edf3d114fb0c90b2b89dbd92039d6760397381d08a497f8b893a5686e"} Mar 20 11:26:48 crc kubenswrapper[4860]: I0320 11:26:48.320828 4860 scope.go:117] "RemoveContainer" 
containerID="b7ac94cb420b15b471714072b91c8c315997e55e973785d5b6c9d7428acd11e7" Mar 20 11:28:00 crc kubenswrapper[4860]: I0320 11:28:00.191791 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566768-r8jtv"] Mar 20 11:28:00 crc kubenswrapper[4860]: E0320 11:28:00.193150 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a62a673-bde2-4cf8-bae1-56252a15c71e" containerName="oc" Mar 20 11:28:00 crc kubenswrapper[4860]: I0320 11:28:00.193168 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a62a673-bde2-4cf8-bae1-56252a15c71e" containerName="oc" Mar 20 11:28:00 crc kubenswrapper[4860]: I0320 11:28:00.193412 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a62a673-bde2-4cf8-bae1-56252a15c71e" containerName="oc" Mar 20 11:28:00 crc kubenswrapper[4860]: I0320 11:28:00.194085 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566768-r8jtv" Mar 20 11:28:00 crc kubenswrapper[4860]: I0320 11:28:00.199527 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:28:00 crc kubenswrapper[4860]: I0320 11:28:00.199711 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-82p2f" Mar 20 11:28:00 crc kubenswrapper[4860]: I0320 11:28:00.200199 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566768-r8jtv"] Mar 20 11:28:00 crc kubenswrapper[4860]: I0320 11:28:00.201495 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:28:00 crc kubenswrapper[4860]: I0320 11:28:00.273624 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85vql\" (UniqueName: \"kubernetes.io/projected/e7b6bebe-e27d-4ca9-9b8b-3f6ba51b8cd2-kube-api-access-85vql\") 
pod \"auto-csr-approver-29566768-r8jtv\" (UID: \"e7b6bebe-e27d-4ca9-9b8b-3f6ba51b8cd2\") " pod="openshift-infra/auto-csr-approver-29566768-r8jtv" Mar 20 11:28:00 crc kubenswrapper[4860]: I0320 11:28:00.374972 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85vql\" (UniqueName: \"kubernetes.io/projected/e7b6bebe-e27d-4ca9-9b8b-3f6ba51b8cd2-kube-api-access-85vql\") pod \"auto-csr-approver-29566768-r8jtv\" (UID: \"e7b6bebe-e27d-4ca9-9b8b-3f6ba51b8cd2\") " pod="openshift-infra/auto-csr-approver-29566768-r8jtv" Mar 20 11:28:00 crc kubenswrapper[4860]: I0320 11:28:00.397748 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85vql\" (UniqueName: \"kubernetes.io/projected/e7b6bebe-e27d-4ca9-9b8b-3f6ba51b8cd2-kube-api-access-85vql\") pod \"auto-csr-approver-29566768-r8jtv\" (UID: \"e7b6bebe-e27d-4ca9-9b8b-3f6ba51b8cd2\") " pod="openshift-infra/auto-csr-approver-29566768-r8jtv" Mar 20 11:28:00 crc kubenswrapper[4860]: I0320 11:28:00.517485 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566768-r8jtv" Mar 20 11:28:00 crc kubenswrapper[4860]: I0320 11:28:00.975285 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566768-r8jtv"] Mar 20 11:28:00 crc kubenswrapper[4860]: I0320 11:28:00.986070 4860 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 11:28:01 crc kubenswrapper[4860]: I0320 11:28:01.725783 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566768-r8jtv" event={"ID":"e7b6bebe-e27d-4ca9-9b8b-3f6ba51b8cd2","Type":"ContainerStarted","Data":"a9a24491117dfbd064a7ae396bca0316960cd788de751fbcc971c98dba991ca6"} Mar 20 11:28:02 crc kubenswrapper[4860]: I0320 11:28:02.735214 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566768-r8jtv" event={"ID":"e7b6bebe-e27d-4ca9-9b8b-3f6ba51b8cd2","Type":"ContainerStarted","Data":"43beffcf03fad4254e2bb3cab94aa4a32cf894399c7d22a72063beb87aa2fc0f"} Mar 20 11:28:02 crc kubenswrapper[4860]: I0320 11:28:02.758006 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566768-r8jtv" podStartSLOduration=1.425143617 podStartE2EDuration="2.757982498s" podCreationTimestamp="2026-03-20 11:28:00 +0000 UTC" firstStartedPulling="2026-03-20 11:28:00.985762892 +0000 UTC m=+2005.207123800" lastFinishedPulling="2026-03-20 11:28:02.318601773 +0000 UTC m=+2006.539962681" observedRunningTime="2026-03-20 11:28:02.75472627 +0000 UTC m=+2006.976087168" watchObservedRunningTime="2026-03-20 11:28:02.757982498 +0000 UTC m=+2006.979343396" Mar 20 11:28:03 crc kubenswrapper[4860]: I0320 11:28:03.744502 4860 generic.go:334] "Generic (PLEG): container finished" podID="e7b6bebe-e27d-4ca9-9b8b-3f6ba51b8cd2" containerID="43beffcf03fad4254e2bb3cab94aa4a32cf894399c7d22a72063beb87aa2fc0f" exitCode=0 Mar 20 11:28:03 crc kubenswrapper[4860]: 
I0320 11:28:03.744573 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566768-r8jtv" event={"ID":"e7b6bebe-e27d-4ca9-9b8b-3f6ba51b8cd2","Type":"ContainerDied","Data":"43beffcf03fad4254e2bb3cab94aa4a32cf894399c7d22a72063beb87aa2fc0f"} Mar 20 11:28:05 crc kubenswrapper[4860]: I0320 11:28:05.089136 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566768-r8jtv" Mar 20 11:28:05 crc kubenswrapper[4860]: I0320 11:28:05.275272 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85vql\" (UniqueName: \"kubernetes.io/projected/e7b6bebe-e27d-4ca9-9b8b-3f6ba51b8cd2-kube-api-access-85vql\") pod \"e7b6bebe-e27d-4ca9-9b8b-3f6ba51b8cd2\" (UID: \"e7b6bebe-e27d-4ca9-9b8b-3f6ba51b8cd2\") " Mar 20 11:28:05 crc kubenswrapper[4860]: I0320 11:28:05.282436 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7b6bebe-e27d-4ca9-9b8b-3f6ba51b8cd2-kube-api-access-85vql" (OuterVolumeSpecName: "kube-api-access-85vql") pod "e7b6bebe-e27d-4ca9-9b8b-3f6ba51b8cd2" (UID: "e7b6bebe-e27d-4ca9-9b8b-3f6ba51b8cd2"). InnerVolumeSpecName "kube-api-access-85vql". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:28:05 crc kubenswrapper[4860]: I0320 11:28:05.377657 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85vql\" (UniqueName: \"kubernetes.io/projected/e7b6bebe-e27d-4ca9-9b8b-3f6ba51b8cd2-kube-api-access-85vql\") on node \"crc\" DevicePath \"\"" Mar 20 11:28:05 crc kubenswrapper[4860]: I0320 11:28:05.761326 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566768-r8jtv" event={"ID":"e7b6bebe-e27d-4ca9-9b8b-3f6ba51b8cd2","Type":"ContainerDied","Data":"a9a24491117dfbd064a7ae396bca0316960cd788de751fbcc971c98dba991ca6"} Mar 20 11:28:05 crc kubenswrapper[4860]: I0320 11:28:05.761780 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9a24491117dfbd064a7ae396bca0316960cd788de751fbcc971c98dba991ca6" Mar 20 11:28:05 crc kubenswrapper[4860]: I0320 11:28:05.761399 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566768-r8jtv" Mar 20 11:28:05 crc kubenswrapper[4860]: I0320 11:28:05.826941 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566762-4zgtb"] Mar 20 11:28:05 crc kubenswrapper[4860]: I0320 11:28:05.833741 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566762-4zgtb"] Mar 20 11:28:07 crc kubenswrapper[4860]: I0320 11:28:07.422792 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92fe5f45-6751-47ad-ba62-ce45b44f7460" path="/var/lib/kubelet/pods/92fe5f45-6751-47ad-ba62-ce45b44f7460/volumes" Mar 20 11:28:48 crc kubenswrapper[4860]: I0320 11:28:48.423803 4860 scope.go:117] "RemoveContainer" containerID="35b3d06c932db51a34b9c266a504a6d9855ee6f888b737047ccd7f4d521d88bd" Mar 20 11:28:52 crc kubenswrapper[4860]: I0320 11:28:52.344262 4860 patch_prober.go:28] interesting pod/machine-config-daemon-kvdqp 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:28:52 crc kubenswrapper[4860]: I0320 11:28:52.344764 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:29:22 crc kubenswrapper[4860]: I0320 11:29:22.344767 4860 patch_prober.go:28] interesting pod/machine-config-daemon-kvdqp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:29:22 crc kubenswrapper[4860]: I0320 11:29:22.345830 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:29:52 crc kubenswrapper[4860]: I0320 11:29:52.344136 4860 patch_prober.go:28] interesting pod/machine-config-daemon-kvdqp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:29:52 crc kubenswrapper[4860]: I0320 11:29:52.344873 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:29:52 crc kubenswrapper[4860]: I0320 11:29:52.344993 4860 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" Mar 20 11:29:52 crc kubenswrapper[4860]: I0320 11:29:52.346458 4860 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b5a83b0edf3d114fb0c90b2b89dbd92039d6760397381d08a497f8b893a5686e"} pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 11:29:52 crc kubenswrapper[4860]: I0320 11:29:52.346617 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" containerID="cri-o://b5a83b0edf3d114fb0c90b2b89dbd92039d6760397381d08a497f8b893a5686e" gracePeriod=600 Mar 20 11:29:52 crc kubenswrapper[4860]: I0320 11:29:52.651086 4860 generic.go:334] "Generic (PLEG): container finished" podID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerID="b5a83b0edf3d114fb0c90b2b89dbd92039d6760397381d08a497f8b893a5686e" exitCode=0 Mar 20 11:29:52 crc kubenswrapper[4860]: I0320 11:29:52.651192 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" event={"ID":"6a9df230-75a1-4b64-8d00-c179e9c19080","Type":"ContainerDied","Data":"b5a83b0edf3d114fb0c90b2b89dbd92039d6760397381d08a497f8b893a5686e"} Mar 20 11:29:52 crc kubenswrapper[4860]: I0320 11:29:52.651950 4860 scope.go:117] "RemoveContainer" containerID="07ab4f7bad4a73673176b5c6eb66987e832ab6b48e2270c87816fed93c5b31d4" Mar 20 11:29:53 crc kubenswrapper[4860]: I0320 11:29:53.664940 4860 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" event={"ID":"6a9df230-75a1-4b64-8d00-c179e9c19080","Type":"ContainerStarted","Data":"174cf0258fb93c52c964e89ccb7ea77d9cb0e4fb77f37a49b81be365390d1f67"} Mar 20 11:30:00 crc kubenswrapper[4860]: I0320 11:30:00.170702 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566770-n4lhg"] Mar 20 11:30:00 crc kubenswrapper[4860]: E0320 11:30:00.171917 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7b6bebe-e27d-4ca9-9b8b-3f6ba51b8cd2" containerName="oc" Mar 20 11:30:00 crc kubenswrapper[4860]: I0320 11:30:00.171936 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7b6bebe-e27d-4ca9-9b8b-3f6ba51b8cd2" containerName="oc" Mar 20 11:30:00 crc kubenswrapper[4860]: I0320 11:30:00.172115 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7b6bebe-e27d-4ca9-9b8b-3f6ba51b8cd2" containerName="oc" Mar 20 11:30:00 crc kubenswrapper[4860]: I0320 11:30:00.172870 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-n4lhg" Mar 20 11:30:00 crc kubenswrapper[4860]: I0320 11:30:00.175801 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 11:30:00 crc kubenswrapper[4860]: I0320 11:30:00.176619 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 11:30:00 crc kubenswrapper[4860]: I0320 11:30:00.183688 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566770-bmm72"] Mar 20 11:30:00 crc kubenswrapper[4860]: I0320 11:30:00.185485 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566770-bmm72" Mar 20 11:30:00 crc kubenswrapper[4860]: I0320 11:30:00.191955 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:30:00 crc kubenswrapper[4860]: I0320 11:30:00.192286 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-82p2f" Mar 20 11:30:00 crc kubenswrapper[4860]: I0320 11:30:00.192370 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566770-n4lhg"] Mar 20 11:30:00 crc kubenswrapper[4860]: I0320 11:30:00.193245 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:30:00 crc kubenswrapper[4860]: I0320 11:30:00.200805 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566770-bmm72"] Mar 20 11:30:00 crc kubenswrapper[4860]: I0320 11:30:00.216882 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wg4p\" (UniqueName: \"kubernetes.io/projected/490c18f1-2364-4847-a7c9-5f603a7dbde2-kube-api-access-4wg4p\") pod \"collect-profiles-29566770-n4lhg\" (UID: \"490c18f1-2364-4847-a7c9-5f603a7dbde2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-n4lhg" Mar 20 11:30:00 crc kubenswrapper[4860]: I0320 11:30:00.216944 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/490c18f1-2364-4847-a7c9-5f603a7dbde2-secret-volume\") pod \"collect-profiles-29566770-n4lhg\" (UID: \"490c18f1-2364-4847-a7c9-5f603a7dbde2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-n4lhg" Mar 20 11:30:00 crc kubenswrapper[4860]: I0320 11:30:00.216981 4860 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/490c18f1-2364-4847-a7c9-5f603a7dbde2-config-volume\") pod \"collect-profiles-29566770-n4lhg\" (UID: \"490c18f1-2364-4847-a7c9-5f603a7dbde2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-n4lhg" Mar 20 11:30:00 crc kubenswrapper[4860]: I0320 11:30:00.217041 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fkjq\" (UniqueName: \"kubernetes.io/projected/d911fad2-83cb-46e3-8f48-eb6f4b0e5605-kube-api-access-9fkjq\") pod \"auto-csr-approver-29566770-bmm72\" (UID: \"d911fad2-83cb-46e3-8f48-eb6f4b0e5605\") " pod="openshift-infra/auto-csr-approver-29566770-bmm72" Mar 20 11:30:00 crc kubenswrapper[4860]: I0320 11:30:00.317763 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fkjq\" (UniqueName: \"kubernetes.io/projected/d911fad2-83cb-46e3-8f48-eb6f4b0e5605-kube-api-access-9fkjq\") pod \"auto-csr-approver-29566770-bmm72\" (UID: \"d911fad2-83cb-46e3-8f48-eb6f4b0e5605\") " pod="openshift-infra/auto-csr-approver-29566770-bmm72" Mar 20 11:30:00 crc kubenswrapper[4860]: I0320 11:30:00.318189 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wg4p\" (UniqueName: \"kubernetes.io/projected/490c18f1-2364-4847-a7c9-5f603a7dbde2-kube-api-access-4wg4p\") pod \"collect-profiles-29566770-n4lhg\" (UID: \"490c18f1-2364-4847-a7c9-5f603a7dbde2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-n4lhg" Mar 20 11:30:00 crc kubenswrapper[4860]: I0320 11:30:00.318378 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/490c18f1-2364-4847-a7c9-5f603a7dbde2-secret-volume\") pod \"collect-profiles-29566770-n4lhg\" (UID: \"490c18f1-2364-4847-a7c9-5f603a7dbde2\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-n4lhg" Mar 20 11:30:00 crc kubenswrapper[4860]: I0320 11:30:00.318520 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/490c18f1-2364-4847-a7c9-5f603a7dbde2-config-volume\") pod \"collect-profiles-29566770-n4lhg\" (UID: \"490c18f1-2364-4847-a7c9-5f603a7dbde2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-n4lhg" Mar 20 11:30:00 crc kubenswrapper[4860]: I0320 11:30:00.319938 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/490c18f1-2364-4847-a7c9-5f603a7dbde2-config-volume\") pod \"collect-profiles-29566770-n4lhg\" (UID: \"490c18f1-2364-4847-a7c9-5f603a7dbde2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-n4lhg" Mar 20 11:30:00 crc kubenswrapper[4860]: I0320 11:30:00.326982 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/490c18f1-2364-4847-a7c9-5f603a7dbde2-secret-volume\") pod \"collect-profiles-29566770-n4lhg\" (UID: \"490c18f1-2364-4847-a7c9-5f603a7dbde2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-n4lhg" Mar 20 11:30:00 crc kubenswrapper[4860]: I0320 11:30:00.336046 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fkjq\" (UniqueName: \"kubernetes.io/projected/d911fad2-83cb-46e3-8f48-eb6f4b0e5605-kube-api-access-9fkjq\") pod \"auto-csr-approver-29566770-bmm72\" (UID: \"d911fad2-83cb-46e3-8f48-eb6f4b0e5605\") " pod="openshift-infra/auto-csr-approver-29566770-bmm72" Mar 20 11:30:00 crc kubenswrapper[4860]: I0320 11:30:00.337074 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wg4p\" (UniqueName: \"kubernetes.io/projected/490c18f1-2364-4847-a7c9-5f603a7dbde2-kube-api-access-4wg4p\") pod 
\"collect-profiles-29566770-n4lhg\" (UID: \"490c18f1-2364-4847-a7c9-5f603a7dbde2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-n4lhg" Mar 20 11:30:00 crc kubenswrapper[4860]: I0320 11:30:00.497025 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-n4lhg" Mar 20 11:30:00 crc kubenswrapper[4860]: I0320 11:30:00.514460 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566770-bmm72" Mar 20 11:30:00 crc kubenswrapper[4860]: I0320 11:30:00.961309 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566770-bmm72"] Mar 20 11:30:01 crc kubenswrapper[4860]: I0320 11:30:01.009512 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566770-n4lhg"] Mar 20 11:30:01 crc kubenswrapper[4860]: W0320 11:30:01.017166 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod490c18f1_2364_4847_a7c9_5f603a7dbde2.slice/crio-454f5c1f9a974ed3f7b49aa7b53c6dd78a91694d9d3e82fb945db64b5ca62fa3 WatchSource:0}: Error finding container 454f5c1f9a974ed3f7b49aa7b53c6dd78a91694d9d3e82fb945db64b5ca62fa3: Status 404 returned error can't find the container with id 454f5c1f9a974ed3f7b49aa7b53c6dd78a91694d9d3e82fb945db64b5ca62fa3 Mar 20 11:30:01 crc kubenswrapper[4860]: I0320 11:30:01.763635 4860 generic.go:334] "Generic (PLEG): container finished" podID="490c18f1-2364-4847-a7c9-5f603a7dbde2" containerID="dea0662d075d177bbae1d376b3ebc31bedea7bf86ce0257b207c8d237e8238e0" exitCode=0 Mar 20 11:30:01 crc kubenswrapper[4860]: I0320 11:30:01.763756 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-n4lhg" 
event={"ID":"490c18f1-2364-4847-a7c9-5f603a7dbde2","Type":"ContainerDied","Data":"dea0662d075d177bbae1d376b3ebc31bedea7bf86ce0257b207c8d237e8238e0"} Mar 20 11:30:01 crc kubenswrapper[4860]: I0320 11:30:01.764565 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-n4lhg" event={"ID":"490c18f1-2364-4847-a7c9-5f603a7dbde2","Type":"ContainerStarted","Data":"454f5c1f9a974ed3f7b49aa7b53c6dd78a91694d9d3e82fb945db64b5ca62fa3"} Mar 20 11:30:01 crc kubenswrapper[4860]: I0320 11:30:01.766644 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566770-bmm72" event={"ID":"d911fad2-83cb-46e3-8f48-eb6f4b0e5605","Type":"ContainerStarted","Data":"1ee4dc5a50bbdc832ddc677756dde3d638c21cf4eb2a169831ce07b51562b0e0"} Mar 20 11:30:02 crc kubenswrapper[4860]: I0320 11:30:02.775058 4860 generic.go:334] "Generic (PLEG): container finished" podID="d911fad2-83cb-46e3-8f48-eb6f4b0e5605" containerID="6c25fc89bb9294757bf7b8ce97118d32231c76b11a8724a70385e83cc510600a" exitCode=0 Mar 20 11:30:02 crc kubenswrapper[4860]: I0320 11:30:02.775111 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566770-bmm72" event={"ID":"d911fad2-83cb-46e3-8f48-eb6f4b0e5605","Type":"ContainerDied","Data":"6c25fc89bb9294757bf7b8ce97118d32231c76b11a8724a70385e83cc510600a"} Mar 20 11:30:03 crc kubenswrapper[4860]: I0320 11:30:03.070764 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-n4lhg" Mar 20 11:30:03 crc kubenswrapper[4860]: I0320 11:30:03.173803 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/490c18f1-2364-4847-a7c9-5f603a7dbde2-config-volume\") pod \"490c18f1-2364-4847-a7c9-5f603a7dbde2\" (UID: \"490c18f1-2364-4847-a7c9-5f603a7dbde2\") " Mar 20 11:30:03 crc kubenswrapper[4860]: I0320 11:30:03.173864 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/490c18f1-2364-4847-a7c9-5f603a7dbde2-secret-volume\") pod \"490c18f1-2364-4847-a7c9-5f603a7dbde2\" (UID: \"490c18f1-2364-4847-a7c9-5f603a7dbde2\") " Mar 20 11:30:03 crc kubenswrapper[4860]: I0320 11:30:03.173934 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wg4p\" (UniqueName: \"kubernetes.io/projected/490c18f1-2364-4847-a7c9-5f603a7dbde2-kube-api-access-4wg4p\") pod \"490c18f1-2364-4847-a7c9-5f603a7dbde2\" (UID: \"490c18f1-2364-4847-a7c9-5f603a7dbde2\") " Mar 20 11:30:03 crc kubenswrapper[4860]: I0320 11:30:03.174906 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/490c18f1-2364-4847-a7c9-5f603a7dbde2-config-volume" (OuterVolumeSpecName: "config-volume") pod "490c18f1-2364-4847-a7c9-5f603a7dbde2" (UID: "490c18f1-2364-4847-a7c9-5f603a7dbde2"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:30:03 crc kubenswrapper[4860]: I0320 11:30:03.180498 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/490c18f1-2364-4847-a7c9-5f603a7dbde2-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "490c18f1-2364-4847-a7c9-5f603a7dbde2" (UID: "490c18f1-2364-4847-a7c9-5f603a7dbde2"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:30:03 crc kubenswrapper[4860]: I0320 11:30:03.180700 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/490c18f1-2364-4847-a7c9-5f603a7dbde2-kube-api-access-4wg4p" (OuterVolumeSpecName: "kube-api-access-4wg4p") pod "490c18f1-2364-4847-a7c9-5f603a7dbde2" (UID: "490c18f1-2364-4847-a7c9-5f603a7dbde2"). InnerVolumeSpecName "kube-api-access-4wg4p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:30:03 crc kubenswrapper[4860]: I0320 11:30:03.275186 4860 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/490c18f1-2364-4847-a7c9-5f603a7dbde2-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 11:30:03 crc kubenswrapper[4860]: I0320 11:30:03.275262 4860 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/490c18f1-2364-4847-a7c9-5f603a7dbde2-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 11:30:03 crc kubenswrapper[4860]: I0320 11:30:03.275275 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wg4p\" (UniqueName: \"kubernetes.io/projected/490c18f1-2364-4847-a7c9-5f603a7dbde2-kube-api-access-4wg4p\") on node \"crc\" DevicePath \"\"" Mar 20 11:30:03 crc kubenswrapper[4860]: I0320 11:30:03.788076 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-n4lhg" Mar 20 11:30:03 crc kubenswrapper[4860]: I0320 11:30:03.788841 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-n4lhg" event={"ID":"490c18f1-2364-4847-a7c9-5f603a7dbde2","Type":"ContainerDied","Data":"454f5c1f9a974ed3f7b49aa7b53c6dd78a91694d9d3e82fb945db64b5ca62fa3"} Mar 20 11:30:03 crc kubenswrapper[4860]: I0320 11:30:03.788868 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="454f5c1f9a974ed3f7b49aa7b53c6dd78a91694d9d3e82fb945db64b5ca62fa3" Mar 20 11:30:04 crc kubenswrapper[4860]: I0320 11:30:04.046768 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566770-bmm72" Mar 20 11:30:04 crc kubenswrapper[4860]: I0320 11:30:04.145835 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566725-d6wf9"] Mar 20 11:30:04 crc kubenswrapper[4860]: I0320 11:30:04.151376 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566725-d6wf9"] Mar 20 11:30:04 crc kubenswrapper[4860]: I0320 11:30:04.224618 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fkjq\" (UniqueName: \"kubernetes.io/projected/d911fad2-83cb-46e3-8f48-eb6f4b0e5605-kube-api-access-9fkjq\") pod \"d911fad2-83cb-46e3-8f48-eb6f4b0e5605\" (UID: \"d911fad2-83cb-46e3-8f48-eb6f4b0e5605\") " Mar 20 11:30:04 crc kubenswrapper[4860]: I0320 11:30:04.228089 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d911fad2-83cb-46e3-8f48-eb6f4b0e5605-kube-api-access-9fkjq" (OuterVolumeSpecName: "kube-api-access-9fkjq") pod "d911fad2-83cb-46e3-8f48-eb6f4b0e5605" (UID: "d911fad2-83cb-46e3-8f48-eb6f4b0e5605"). 
InnerVolumeSpecName "kube-api-access-9fkjq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:30:04 crc kubenswrapper[4860]: I0320 11:30:04.327101 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fkjq\" (UniqueName: \"kubernetes.io/projected/d911fad2-83cb-46e3-8f48-eb6f4b0e5605-kube-api-access-9fkjq\") on node \"crc\" DevicePath \"\"" Mar 20 11:30:04 crc kubenswrapper[4860]: I0320 11:30:04.796645 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566770-bmm72" event={"ID":"d911fad2-83cb-46e3-8f48-eb6f4b0e5605","Type":"ContainerDied","Data":"1ee4dc5a50bbdc832ddc677756dde3d638c21cf4eb2a169831ce07b51562b0e0"} Mar 20 11:30:04 crc kubenswrapper[4860]: I0320 11:30:04.796699 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ee4dc5a50bbdc832ddc677756dde3d638c21cf4eb2a169831ce07b51562b0e0" Mar 20 11:30:04 crc kubenswrapper[4860]: I0320 11:30:04.796751 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566770-bmm72" Mar 20 11:30:05 crc kubenswrapper[4860]: I0320 11:30:05.101675 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566764-lml7m"] Mar 20 11:30:05 crc kubenswrapper[4860]: I0320 11:30:05.108091 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566764-lml7m"] Mar 20 11:30:05 crc kubenswrapper[4860]: I0320 11:30:05.424091 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="437c32d4-4b5f-4657-86d6-5214e3bfc01f" path="/var/lib/kubelet/pods/437c32d4-4b5f-4657-86d6-5214e3bfc01f/volumes" Mar 20 11:30:05 crc kubenswrapper[4860]: I0320 11:30:05.425004 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d332346-eeda-4316-9757-20948492ca2a" path="/var/lib/kubelet/pods/6d332346-eeda-4316-9757-20948492ca2a/volumes" Mar 20 11:30:48 crc kubenswrapper[4860]: I0320 11:30:48.539192 4860 scope.go:117] "RemoveContainer" containerID="c19bdb0d3c5267be319cea9d2984f965c5e932c087050f789ce73157db7c4694" Mar 20 11:30:48 crc kubenswrapper[4860]: I0320 11:30:48.592974 4860 scope.go:117] "RemoveContainer" containerID="509a0ab6073b8f241ed054d972f10c10904777731b271c4522d9caaf55b66c8c" Mar 20 11:31:13 crc kubenswrapper[4860]: I0320 11:31:13.499626 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nvhg7"] Mar 20 11:31:13 crc kubenswrapper[4860]: E0320 11:31:13.500711 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="490c18f1-2364-4847-a7c9-5f603a7dbde2" containerName="collect-profiles" Mar 20 11:31:13 crc kubenswrapper[4860]: I0320 11:31:13.500731 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="490c18f1-2364-4847-a7c9-5f603a7dbde2" containerName="collect-profiles" Mar 20 11:31:13 crc kubenswrapper[4860]: E0320 11:31:13.500760 4860 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d911fad2-83cb-46e3-8f48-eb6f4b0e5605" containerName="oc" Mar 20 11:31:13 crc kubenswrapper[4860]: I0320 11:31:13.500767 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="d911fad2-83cb-46e3-8f48-eb6f4b0e5605" containerName="oc" Mar 20 11:31:13 crc kubenswrapper[4860]: I0320 11:31:13.500954 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="d911fad2-83cb-46e3-8f48-eb6f4b0e5605" containerName="oc" Mar 20 11:31:13 crc kubenswrapper[4860]: I0320 11:31:13.500984 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="490c18f1-2364-4847-a7c9-5f603a7dbde2" containerName="collect-profiles" Mar 20 11:31:13 crc kubenswrapper[4860]: I0320 11:31:13.502315 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nvhg7" Mar 20 11:31:13 crc kubenswrapper[4860]: I0320 11:31:13.520982 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nvhg7"] Mar 20 11:31:13 crc kubenswrapper[4860]: I0320 11:31:13.702046 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ab38144-c30d-4aed-884c-8ace682fe5ea-utilities\") pod \"community-operators-nvhg7\" (UID: \"4ab38144-c30d-4aed-884c-8ace682fe5ea\") " pod="openshift-marketplace/community-operators-nvhg7" Mar 20 11:31:13 crc kubenswrapper[4860]: I0320 11:31:13.702134 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42f4q\" (UniqueName: \"kubernetes.io/projected/4ab38144-c30d-4aed-884c-8ace682fe5ea-kube-api-access-42f4q\") pod \"community-operators-nvhg7\" (UID: \"4ab38144-c30d-4aed-884c-8ace682fe5ea\") " pod="openshift-marketplace/community-operators-nvhg7" Mar 20 11:31:13 crc kubenswrapper[4860]: I0320 11:31:13.702360 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ab38144-c30d-4aed-884c-8ace682fe5ea-catalog-content\") pod \"community-operators-nvhg7\" (UID: \"4ab38144-c30d-4aed-884c-8ace682fe5ea\") " pod="openshift-marketplace/community-operators-nvhg7" Mar 20 11:31:13 crc kubenswrapper[4860]: I0320 11:31:13.804294 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ab38144-c30d-4aed-884c-8ace682fe5ea-utilities\") pod \"community-operators-nvhg7\" (UID: \"4ab38144-c30d-4aed-884c-8ace682fe5ea\") " pod="openshift-marketplace/community-operators-nvhg7" Mar 20 11:31:13 crc kubenswrapper[4860]: I0320 11:31:13.804398 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42f4q\" (UniqueName: \"kubernetes.io/projected/4ab38144-c30d-4aed-884c-8ace682fe5ea-kube-api-access-42f4q\") pod \"community-operators-nvhg7\" (UID: \"4ab38144-c30d-4aed-884c-8ace682fe5ea\") " pod="openshift-marketplace/community-operators-nvhg7" Mar 20 11:31:13 crc kubenswrapper[4860]: I0320 11:31:13.804455 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ab38144-c30d-4aed-884c-8ace682fe5ea-catalog-content\") pod \"community-operators-nvhg7\" (UID: \"4ab38144-c30d-4aed-884c-8ace682fe5ea\") " pod="openshift-marketplace/community-operators-nvhg7" Mar 20 11:31:13 crc kubenswrapper[4860]: I0320 11:31:13.805246 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ab38144-c30d-4aed-884c-8ace682fe5ea-utilities\") pod \"community-operators-nvhg7\" (UID: \"4ab38144-c30d-4aed-884c-8ace682fe5ea\") " pod="openshift-marketplace/community-operators-nvhg7" Mar 20 11:31:13 crc kubenswrapper[4860]: I0320 11:31:13.805321 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/4ab38144-c30d-4aed-884c-8ace682fe5ea-catalog-content\") pod \"community-operators-nvhg7\" (UID: \"4ab38144-c30d-4aed-884c-8ace682fe5ea\") " pod="openshift-marketplace/community-operators-nvhg7" Mar 20 11:31:13 crc kubenswrapper[4860]: I0320 11:31:13.831562 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42f4q\" (UniqueName: \"kubernetes.io/projected/4ab38144-c30d-4aed-884c-8ace682fe5ea-kube-api-access-42f4q\") pod \"community-operators-nvhg7\" (UID: \"4ab38144-c30d-4aed-884c-8ace682fe5ea\") " pod="openshift-marketplace/community-operators-nvhg7" Mar 20 11:31:14 crc kubenswrapper[4860]: I0320 11:31:14.127808 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nvhg7" Mar 20 11:31:14 crc kubenswrapper[4860]: I0320 11:31:14.574154 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nvhg7"] Mar 20 11:31:15 crc kubenswrapper[4860]: I0320 11:31:15.406657 4860 generic.go:334] "Generic (PLEG): container finished" podID="4ab38144-c30d-4aed-884c-8ace682fe5ea" containerID="107cbe0685170349259a50f59931a9f77bb6c0d4533641b8385370d8b9ab6b8f" exitCode=0 Mar 20 11:31:15 crc kubenswrapper[4860]: I0320 11:31:15.406943 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nvhg7" event={"ID":"4ab38144-c30d-4aed-884c-8ace682fe5ea","Type":"ContainerDied","Data":"107cbe0685170349259a50f59931a9f77bb6c0d4533641b8385370d8b9ab6b8f"} Mar 20 11:31:15 crc kubenswrapper[4860]: I0320 11:31:15.407178 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nvhg7" event={"ID":"4ab38144-c30d-4aed-884c-8ace682fe5ea","Type":"ContainerStarted","Data":"86187c0243067078a3a74947d273e6d6198228ed2ad14a70469845cf59fe1145"} Mar 20 11:31:20 crc kubenswrapper[4860]: I0320 11:31:20.447377 4860 generic.go:334] "Generic (PLEG): container 
finished" podID="4ab38144-c30d-4aed-884c-8ace682fe5ea" containerID="853c52a81e8d09dce968d126eef52abf63c6037a7202e7f785694020dbc92c61" exitCode=0 Mar 20 11:31:20 crc kubenswrapper[4860]: I0320 11:31:20.447434 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nvhg7" event={"ID":"4ab38144-c30d-4aed-884c-8ace682fe5ea","Type":"ContainerDied","Data":"853c52a81e8d09dce968d126eef52abf63c6037a7202e7f785694020dbc92c61"} Mar 20 11:31:21 crc kubenswrapper[4860]: I0320 11:31:21.458718 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nvhg7" event={"ID":"4ab38144-c30d-4aed-884c-8ace682fe5ea","Type":"ContainerStarted","Data":"75ebac83878486e06b87ecb22c6114329904482ce09343e98ff64010821eafca"} Mar 20 11:31:24 crc kubenswrapper[4860]: I0320 11:31:24.128403 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nvhg7" Mar 20 11:31:24 crc kubenswrapper[4860]: I0320 11:31:24.128942 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nvhg7" Mar 20 11:31:24 crc kubenswrapper[4860]: I0320 11:31:24.202580 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nvhg7" Mar 20 11:31:24 crc kubenswrapper[4860]: I0320 11:31:24.229398 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nvhg7" podStartSLOduration=5.701737457 podStartE2EDuration="11.22937799s" podCreationTimestamp="2026-03-20 11:31:13 +0000 UTC" firstStartedPulling="2026-03-20 11:31:15.409081237 +0000 UTC m=+2199.630442135" lastFinishedPulling="2026-03-20 11:31:20.93672177 +0000 UTC m=+2205.158082668" observedRunningTime="2026-03-20 11:31:21.47819002 +0000 UTC m=+2205.699550948" watchObservedRunningTime="2026-03-20 11:31:24.22937799 +0000 UTC m=+2208.450738878" Mar 20 
11:31:34 crc kubenswrapper[4860]: I0320 11:31:34.287907 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nvhg7" Mar 20 11:31:34 crc kubenswrapper[4860]: I0320 11:31:34.386540 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nvhg7"] Mar 20 11:31:34 crc kubenswrapper[4860]: I0320 11:31:34.451639 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-s2x6p"] Mar 20 11:31:34 crc kubenswrapper[4860]: I0320 11:31:34.452021 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-s2x6p" podUID="da2b2cab-e4d8-48ed-b198-7aff45927348" containerName="registry-server" containerID="cri-o://c529e06c44dcbe461813fdd459369a14b6b219590561c7a4393e8a5bf49d0970" gracePeriod=2 Mar 20 11:31:35 crc kubenswrapper[4860]: I0320 11:31:35.617633 4860 generic.go:334] "Generic (PLEG): container finished" podID="da2b2cab-e4d8-48ed-b198-7aff45927348" containerID="c529e06c44dcbe461813fdd459369a14b6b219590561c7a4393e8a5bf49d0970" exitCode=0 Mar 20 11:31:35 crc kubenswrapper[4860]: I0320 11:31:35.617724 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s2x6p" event={"ID":"da2b2cab-e4d8-48ed-b198-7aff45927348","Type":"ContainerDied","Data":"c529e06c44dcbe461813fdd459369a14b6b219590561c7a4393e8a5bf49d0970"} Mar 20 11:31:35 crc kubenswrapper[4860]: I0320 11:31:35.689936 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-s2x6p" Mar 20 11:31:35 crc kubenswrapper[4860]: I0320 11:31:35.790998 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da2b2cab-e4d8-48ed-b198-7aff45927348-utilities\") pod \"da2b2cab-e4d8-48ed-b198-7aff45927348\" (UID: \"da2b2cab-e4d8-48ed-b198-7aff45927348\") " Mar 20 11:31:35 crc kubenswrapper[4860]: I0320 11:31:35.791068 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da2b2cab-e4d8-48ed-b198-7aff45927348-catalog-content\") pod \"da2b2cab-e4d8-48ed-b198-7aff45927348\" (UID: \"da2b2cab-e4d8-48ed-b198-7aff45927348\") " Mar 20 11:31:35 crc kubenswrapper[4860]: I0320 11:31:35.791119 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-267pr\" (UniqueName: \"kubernetes.io/projected/da2b2cab-e4d8-48ed-b198-7aff45927348-kube-api-access-267pr\") pod \"da2b2cab-e4d8-48ed-b198-7aff45927348\" (UID: \"da2b2cab-e4d8-48ed-b198-7aff45927348\") " Mar 20 11:31:35 crc kubenswrapper[4860]: I0320 11:31:35.792359 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da2b2cab-e4d8-48ed-b198-7aff45927348-utilities" (OuterVolumeSpecName: "utilities") pod "da2b2cab-e4d8-48ed-b198-7aff45927348" (UID: "da2b2cab-e4d8-48ed-b198-7aff45927348"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:31:35 crc kubenswrapper[4860]: I0320 11:31:35.823913 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da2b2cab-e4d8-48ed-b198-7aff45927348-kube-api-access-267pr" (OuterVolumeSpecName: "kube-api-access-267pr") pod "da2b2cab-e4d8-48ed-b198-7aff45927348" (UID: "da2b2cab-e4d8-48ed-b198-7aff45927348"). InnerVolumeSpecName "kube-api-access-267pr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:31:35 crc kubenswrapper[4860]: I0320 11:31:35.874015 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da2b2cab-e4d8-48ed-b198-7aff45927348-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "da2b2cab-e4d8-48ed-b198-7aff45927348" (UID: "da2b2cab-e4d8-48ed-b198-7aff45927348"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:31:35 crc kubenswrapper[4860]: I0320 11:31:35.892305 4860 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da2b2cab-e4d8-48ed-b198-7aff45927348-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:31:35 crc kubenswrapper[4860]: I0320 11:31:35.892349 4860 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da2b2cab-e4d8-48ed-b198-7aff45927348-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:31:35 crc kubenswrapper[4860]: I0320 11:31:35.892363 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-267pr\" (UniqueName: \"kubernetes.io/projected/da2b2cab-e4d8-48ed-b198-7aff45927348-kube-api-access-267pr\") on node \"crc\" DevicePath \"\"" Mar 20 11:31:36 crc kubenswrapper[4860]: I0320 11:31:36.630665 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s2x6p" event={"ID":"da2b2cab-e4d8-48ed-b198-7aff45927348","Type":"ContainerDied","Data":"0985eda395c30cb4fc11c5a030b8aabf733cd8d60366ed8ecb07d45313940c24"} Mar 20 11:31:36 crc kubenswrapper[4860]: I0320 11:31:36.630743 4860 scope.go:117] "RemoveContainer" containerID="c529e06c44dcbe461813fdd459369a14b6b219590561c7a4393e8a5bf49d0970" Mar 20 11:31:36 crc kubenswrapper[4860]: I0320 11:31:36.630755 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-s2x6p" Mar 20 11:31:36 crc kubenswrapper[4860]: I0320 11:31:36.661660 4860 scope.go:117] "RemoveContainer" containerID="3692f916ebd9e82f76728a61b0840c3354adb6672a36ba82bf89b317d7536cc6" Mar 20 11:31:36 crc kubenswrapper[4860]: I0320 11:31:36.667427 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-s2x6p"] Mar 20 11:31:36 crc kubenswrapper[4860]: I0320 11:31:36.681068 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-s2x6p"] Mar 20 11:31:36 crc kubenswrapper[4860]: I0320 11:31:36.690381 4860 scope.go:117] "RemoveContainer" containerID="a4f90ca93d3e43497e705c4521beb4348408ab8d69ef5b2bcd7028aec3d686d5" Mar 20 11:31:37 crc kubenswrapper[4860]: I0320 11:31:37.426725 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da2b2cab-e4d8-48ed-b198-7aff45927348" path="/var/lib/kubelet/pods/da2b2cab-e4d8-48ed-b198-7aff45927348/volumes" Mar 20 11:31:52 crc kubenswrapper[4860]: I0320 11:31:52.344611 4860 patch_prober.go:28] interesting pod/machine-config-daemon-kvdqp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:31:52 crc kubenswrapper[4860]: I0320 11:31:52.345453 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:32:00 crc kubenswrapper[4860]: I0320 11:32:00.145385 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566772-bq77p"] Mar 20 11:32:00 crc kubenswrapper[4860]: E0320 
11:32:00.146253 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da2b2cab-e4d8-48ed-b198-7aff45927348" containerName="extract-utilities" Mar 20 11:32:00 crc kubenswrapper[4860]: I0320 11:32:00.146268 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="da2b2cab-e4d8-48ed-b198-7aff45927348" containerName="extract-utilities" Mar 20 11:32:00 crc kubenswrapper[4860]: E0320 11:32:00.146285 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da2b2cab-e4d8-48ed-b198-7aff45927348" containerName="extract-content" Mar 20 11:32:00 crc kubenswrapper[4860]: I0320 11:32:00.146292 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="da2b2cab-e4d8-48ed-b198-7aff45927348" containerName="extract-content" Mar 20 11:32:00 crc kubenswrapper[4860]: E0320 11:32:00.146309 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da2b2cab-e4d8-48ed-b198-7aff45927348" containerName="registry-server" Mar 20 11:32:00 crc kubenswrapper[4860]: I0320 11:32:00.146316 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="da2b2cab-e4d8-48ed-b198-7aff45927348" containerName="registry-server" Mar 20 11:32:00 crc kubenswrapper[4860]: I0320 11:32:00.146482 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="da2b2cab-e4d8-48ed-b198-7aff45927348" containerName="registry-server" Mar 20 11:32:00 crc kubenswrapper[4860]: I0320 11:32:00.147007 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566772-bq77p" Mar 20 11:32:00 crc kubenswrapper[4860]: I0320 11:32:00.157408 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566772-bq77p"] Mar 20 11:32:00 crc kubenswrapper[4860]: I0320 11:32:00.161096 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:32:00 crc kubenswrapper[4860]: I0320 11:32:00.161392 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-82p2f" Mar 20 11:32:00 crc kubenswrapper[4860]: I0320 11:32:00.161542 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:32:00 crc kubenswrapper[4860]: I0320 11:32:00.307831 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7b8fx\" (UniqueName: \"kubernetes.io/projected/e22a4be9-9edf-4029-b504-f5c059318959-kube-api-access-7b8fx\") pod \"auto-csr-approver-29566772-bq77p\" (UID: \"e22a4be9-9edf-4029-b504-f5c059318959\") " pod="openshift-infra/auto-csr-approver-29566772-bq77p" Mar 20 11:32:00 crc kubenswrapper[4860]: I0320 11:32:00.409331 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7b8fx\" (UniqueName: \"kubernetes.io/projected/e22a4be9-9edf-4029-b504-f5c059318959-kube-api-access-7b8fx\") pod \"auto-csr-approver-29566772-bq77p\" (UID: \"e22a4be9-9edf-4029-b504-f5c059318959\") " pod="openshift-infra/auto-csr-approver-29566772-bq77p" Mar 20 11:32:00 crc kubenswrapper[4860]: I0320 11:32:00.442692 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7b8fx\" (UniqueName: \"kubernetes.io/projected/e22a4be9-9edf-4029-b504-f5c059318959-kube-api-access-7b8fx\") pod \"auto-csr-approver-29566772-bq77p\" (UID: \"e22a4be9-9edf-4029-b504-f5c059318959\") " 
pod="openshift-infra/auto-csr-approver-29566772-bq77p" Mar 20 11:32:00 crc kubenswrapper[4860]: I0320 11:32:00.469722 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566772-bq77p" Mar 20 11:32:00 crc kubenswrapper[4860]: I0320 11:32:00.910994 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566772-bq77p"] Mar 20 11:32:01 crc kubenswrapper[4860]: I0320 11:32:01.875146 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566772-bq77p" event={"ID":"e22a4be9-9edf-4029-b504-f5c059318959","Type":"ContainerStarted","Data":"83a239c60ed9ecb77830fb64d9ff1297eff5199a99cf5a3d882155d3a988e68e"} Mar 20 11:32:02 crc kubenswrapper[4860]: I0320 11:32:02.885077 4860 generic.go:334] "Generic (PLEG): container finished" podID="e22a4be9-9edf-4029-b504-f5c059318959" containerID="954a495195a9f9d931051a4c1f1eba69bbfb896f6fe4601ca7ac4a6c57e030ea" exitCode=0 Mar 20 11:32:02 crc kubenswrapper[4860]: I0320 11:32:02.885177 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566772-bq77p" event={"ID":"e22a4be9-9edf-4029-b504-f5c059318959","Type":"ContainerDied","Data":"954a495195a9f9d931051a4c1f1eba69bbfb896f6fe4601ca7ac4a6c57e030ea"} Mar 20 11:32:04 crc kubenswrapper[4860]: I0320 11:32:04.224393 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566772-bq77p" Mar 20 11:32:04 crc kubenswrapper[4860]: I0320 11:32:04.379636 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7b8fx\" (UniqueName: \"kubernetes.io/projected/e22a4be9-9edf-4029-b504-f5c059318959-kube-api-access-7b8fx\") pod \"e22a4be9-9edf-4029-b504-f5c059318959\" (UID: \"e22a4be9-9edf-4029-b504-f5c059318959\") " Mar 20 11:32:04 crc kubenswrapper[4860]: I0320 11:32:04.389847 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e22a4be9-9edf-4029-b504-f5c059318959-kube-api-access-7b8fx" (OuterVolumeSpecName: "kube-api-access-7b8fx") pod "e22a4be9-9edf-4029-b504-f5c059318959" (UID: "e22a4be9-9edf-4029-b504-f5c059318959"). InnerVolumeSpecName "kube-api-access-7b8fx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:32:04 crc kubenswrapper[4860]: I0320 11:32:04.485683 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7b8fx\" (UniqueName: \"kubernetes.io/projected/e22a4be9-9edf-4029-b504-f5c059318959-kube-api-access-7b8fx\") on node \"crc\" DevicePath \"\"" Mar 20 11:32:04 crc kubenswrapper[4860]: I0320 11:32:04.903588 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566772-bq77p" event={"ID":"e22a4be9-9edf-4029-b504-f5c059318959","Type":"ContainerDied","Data":"83a239c60ed9ecb77830fb64d9ff1297eff5199a99cf5a3d882155d3a988e68e"} Mar 20 11:32:04 crc kubenswrapper[4860]: I0320 11:32:04.903640 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83a239c60ed9ecb77830fb64d9ff1297eff5199a99cf5a3d882155d3a988e68e" Mar 20 11:32:04 crc kubenswrapper[4860]: I0320 11:32:04.903684 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566772-bq77p" Mar 20 11:32:05 crc kubenswrapper[4860]: I0320 11:32:05.304416 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566766-fhwnn"] Mar 20 11:32:05 crc kubenswrapper[4860]: I0320 11:32:05.311497 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566766-fhwnn"] Mar 20 11:32:05 crc kubenswrapper[4860]: I0320 11:32:05.423708 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a62a673-bde2-4cf8-bae1-56252a15c71e" path="/var/lib/kubelet/pods/8a62a673-bde2-4cf8-bae1-56252a15c71e/volumes" Mar 20 11:32:22 crc kubenswrapper[4860]: I0320 11:32:22.344723 4860 patch_prober.go:28] interesting pod/machine-config-daemon-kvdqp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:32:22 crc kubenswrapper[4860]: I0320 11:32:22.345650 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:32:47 crc kubenswrapper[4860]: I0320 11:32:47.159521 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-h8mrm"] Mar 20 11:32:47 crc kubenswrapper[4860]: E0320 11:32:47.162076 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e22a4be9-9edf-4029-b504-f5c059318959" containerName="oc" Mar 20 11:32:47 crc kubenswrapper[4860]: I0320 11:32:47.162189 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="e22a4be9-9edf-4029-b504-f5c059318959" containerName="oc" Mar 20 11:32:47 crc 
kubenswrapper[4860]: I0320 11:32:47.162468 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="e22a4be9-9edf-4029-b504-f5c059318959" containerName="oc" Mar 20 11:32:47 crc kubenswrapper[4860]: I0320 11:32:47.163966 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-h8mrm" Mar 20 11:32:47 crc kubenswrapper[4860]: I0320 11:32:47.166502 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-h8mrm"] Mar 20 11:32:47 crc kubenswrapper[4860]: I0320 11:32:47.211750 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2de32a4d-295e-4e53-9224-445137c28938-catalog-content\") pod \"redhat-operators-h8mrm\" (UID: \"2de32a4d-295e-4e53-9224-445137c28938\") " pod="openshift-marketplace/redhat-operators-h8mrm" Mar 20 11:32:47 crc kubenswrapper[4860]: I0320 11:32:47.211943 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6b796\" (UniqueName: \"kubernetes.io/projected/2de32a4d-295e-4e53-9224-445137c28938-kube-api-access-6b796\") pod \"redhat-operators-h8mrm\" (UID: \"2de32a4d-295e-4e53-9224-445137c28938\") " pod="openshift-marketplace/redhat-operators-h8mrm" Mar 20 11:32:47 crc kubenswrapper[4860]: I0320 11:32:47.212019 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2de32a4d-295e-4e53-9224-445137c28938-utilities\") pod \"redhat-operators-h8mrm\" (UID: \"2de32a4d-295e-4e53-9224-445137c28938\") " pod="openshift-marketplace/redhat-operators-h8mrm" Mar 20 11:32:47 crc kubenswrapper[4860]: I0320 11:32:47.314115 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6b796\" (UniqueName: 
\"kubernetes.io/projected/2de32a4d-295e-4e53-9224-445137c28938-kube-api-access-6b796\") pod \"redhat-operators-h8mrm\" (UID: \"2de32a4d-295e-4e53-9224-445137c28938\") " pod="openshift-marketplace/redhat-operators-h8mrm" Mar 20 11:32:47 crc kubenswrapper[4860]: I0320 11:32:47.314198 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2de32a4d-295e-4e53-9224-445137c28938-utilities\") pod \"redhat-operators-h8mrm\" (UID: \"2de32a4d-295e-4e53-9224-445137c28938\") " pod="openshift-marketplace/redhat-operators-h8mrm" Mar 20 11:32:47 crc kubenswrapper[4860]: I0320 11:32:47.314258 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2de32a4d-295e-4e53-9224-445137c28938-catalog-content\") pod \"redhat-operators-h8mrm\" (UID: \"2de32a4d-295e-4e53-9224-445137c28938\") " pod="openshift-marketplace/redhat-operators-h8mrm" Mar 20 11:32:47 crc kubenswrapper[4860]: I0320 11:32:47.315493 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2de32a4d-295e-4e53-9224-445137c28938-catalog-content\") pod \"redhat-operators-h8mrm\" (UID: \"2de32a4d-295e-4e53-9224-445137c28938\") " pod="openshift-marketplace/redhat-operators-h8mrm" Mar 20 11:32:47 crc kubenswrapper[4860]: I0320 11:32:47.315698 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2de32a4d-295e-4e53-9224-445137c28938-utilities\") pod \"redhat-operators-h8mrm\" (UID: \"2de32a4d-295e-4e53-9224-445137c28938\") " pod="openshift-marketplace/redhat-operators-h8mrm" Mar 20 11:32:47 crc kubenswrapper[4860]: I0320 11:32:47.337864 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6b796\" (UniqueName: 
\"kubernetes.io/projected/2de32a4d-295e-4e53-9224-445137c28938-kube-api-access-6b796\") pod \"redhat-operators-h8mrm\" (UID: \"2de32a4d-295e-4e53-9224-445137c28938\") " pod="openshift-marketplace/redhat-operators-h8mrm" Mar 20 11:32:47 crc kubenswrapper[4860]: I0320 11:32:47.488366 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-h8mrm" Mar 20 11:32:48 crc kubenswrapper[4860]: I0320 11:32:48.565305 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-h8mrm"] Mar 20 11:32:48 crc kubenswrapper[4860]: I0320 11:32:48.679808 4860 scope.go:117] "RemoveContainer" containerID="c8ce87256d0115d4f80012e1ce15b95a524f1b59f29da3d0c9299a0f68d2780e" Mar 20 11:32:49 crc kubenswrapper[4860]: I0320 11:32:49.265543 4860 generic.go:334] "Generic (PLEG): container finished" podID="2de32a4d-295e-4e53-9224-445137c28938" containerID="e2395973e064152259253eb58c0c6a0446f283f6f52bc2a04815836f797ae9be" exitCode=0 Mar 20 11:32:49 crc kubenswrapper[4860]: I0320 11:32:49.265591 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h8mrm" event={"ID":"2de32a4d-295e-4e53-9224-445137c28938","Type":"ContainerDied","Data":"e2395973e064152259253eb58c0c6a0446f283f6f52bc2a04815836f797ae9be"} Mar 20 11:32:49 crc kubenswrapper[4860]: I0320 11:32:49.265642 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h8mrm" event={"ID":"2de32a4d-295e-4e53-9224-445137c28938","Type":"ContainerStarted","Data":"84a0d03260848fa62a51d1fe5b7dc42a764bbd619dfd6c2877c6efe12414a0da"} Mar 20 11:32:50 crc kubenswrapper[4860]: I0320 11:32:50.276674 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h8mrm" event={"ID":"2de32a4d-295e-4e53-9224-445137c28938","Type":"ContainerStarted","Data":"e840894dd4faef1c4efdefff7e91fad1e5c6f6dbc5fd5e68fc6a294c9ad64bf2"} Mar 20 11:32:52 crc 
kubenswrapper[4860]: I0320 11:32:52.387358 4860 patch_prober.go:28] interesting pod/machine-config-daemon-kvdqp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:32:52 crc kubenswrapper[4860]: I0320 11:32:52.388341 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:32:52 crc kubenswrapper[4860]: I0320 11:32:52.388818 4860 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" Mar 20 11:32:52 crc kubenswrapper[4860]: I0320 11:32:52.389535 4860 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"174cf0258fb93c52c964e89ccb7ea77d9cb0e4fb77f37a49b81be365390d1f67"} pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 11:32:52 crc kubenswrapper[4860]: I0320 11:32:52.389623 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" containerID="cri-o://174cf0258fb93c52c964e89ccb7ea77d9cb0e4fb77f37a49b81be365390d1f67" gracePeriod=600 Mar 20 11:32:52 crc kubenswrapper[4860]: E0320 11:32:52.534516 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:32:53 crc kubenswrapper[4860]: I0320 11:32:53.410115 4860 generic.go:334] "Generic (PLEG): container finished" podID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerID="174cf0258fb93c52c964e89ccb7ea77d9cb0e4fb77f37a49b81be365390d1f67" exitCode=0 Mar 20 11:32:53 crc kubenswrapper[4860]: I0320 11:32:53.410771 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" event={"ID":"6a9df230-75a1-4b64-8d00-c179e9c19080","Type":"ContainerDied","Data":"174cf0258fb93c52c964e89ccb7ea77d9cb0e4fb77f37a49b81be365390d1f67"} Mar 20 11:32:53 crc kubenswrapper[4860]: I0320 11:32:53.410897 4860 scope.go:117] "RemoveContainer" containerID="b5a83b0edf3d114fb0c90b2b89dbd92039d6760397381d08a497f8b893a5686e" Mar 20 11:32:53 crc kubenswrapper[4860]: I0320 11:32:53.411696 4860 scope.go:117] "RemoveContainer" containerID="174cf0258fb93c52c964e89ccb7ea77d9cb0e4fb77f37a49b81be365390d1f67" Mar 20 11:32:53 crc kubenswrapper[4860]: E0320 11:32:53.412029 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:32:54 crc kubenswrapper[4860]: I0320 11:32:54.423927 4860 generic.go:334] "Generic (PLEG): container finished" podID="2de32a4d-295e-4e53-9224-445137c28938" containerID="e840894dd4faef1c4efdefff7e91fad1e5c6f6dbc5fd5e68fc6a294c9ad64bf2" exitCode=0 Mar 20 11:32:54 crc kubenswrapper[4860]: 
I0320 11:32:54.423990 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h8mrm" event={"ID":"2de32a4d-295e-4e53-9224-445137c28938","Type":"ContainerDied","Data":"e840894dd4faef1c4efdefff7e91fad1e5c6f6dbc5fd5e68fc6a294c9ad64bf2"} Mar 20 11:32:55 crc kubenswrapper[4860]: I0320 11:32:55.451574 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h8mrm" event={"ID":"2de32a4d-295e-4e53-9224-445137c28938","Type":"ContainerStarted","Data":"c313902cf719e8c3d691f2bb4c13d13ceada33e566d455668762f62c1a440606"} Mar 20 11:32:55 crc kubenswrapper[4860]: I0320 11:32:55.481700 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-h8mrm" podStartSLOduration=2.907645797 podStartE2EDuration="8.481675437s" podCreationTimestamp="2026-03-20 11:32:47 +0000 UTC" firstStartedPulling="2026-03-20 11:32:49.267839099 +0000 UTC m=+2293.489199997" lastFinishedPulling="2026-03-20 11:32:54.841868709 +0000 UTC m=+2299.063229637" observedRunningTime="2026-03-20 11:32:55.480245898 +0000 UTC m=+2299.701606796" watchObservedRunningTime="2026-03-20 11:32:55.481675437 +0000 UTC m=+2299.703036335" Mar 20 11:32:57 crc kubenswrapper[4860]: I0320 11:32:57.695818 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-h8mrm" Mar 20 11:32:57 crc kubenswrapper[4860]: I0320 11:32:57.697193 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-h8mrm" Mar 20 11:32:58 crc kubenswrapper[4860]: I0320 11:32:58.726378 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-h8mrm" podUID="2de32a4d-295e-4e53-9224-445137c28938" containerName="registry-server" probeResult="failure" output=< Mar 20 11:32:58 crc kubenswrapper[4860]: timeout: failed to connect service ":50051" within 1s Mar 20 11:32:58 crc 
kubenswrapper[4860]: > Mar 20 11:33:07 crc kubenswrapper[4860]: I0320 11:33:07.418857 4860 scope.go:117] "RemoveContainer" containerID="174cf0258fb93c52c964e89ccb7ea77d9cb0e4fb77f37a49b81be365390d1f67" Mar 20 11:33:07 crc kubenswrapper[4860]: E0320 11:33:07.420034 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:33:07 crc kubenswrapper[4860]: I0320 11:33:07.536628 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-h8mrm" Mar 20 11:33:07 crc kubenswrapper[4860]: I0320 11:33:07.583523 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-h8mrm" Mar 20 11:33:07 crc kubenswrapper[4860]: I0320 11:33:07.777508 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-h8mrm"] Mar 20 11:33:08 crc kubenswrapper[4860]: I0320 11:33:08.794538 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-h8mrm" podUID="2de32a4d-295e-4e53-9224-445137c28938" containerName="registry-server" containerID="cri-o://c313902cf719e8c3d691f2bb4c13d13ceada33e566d455668762f62c1a440606" gracePeriod=2 Mar 20 11:33:09 crc kubenswrapper[4860]: I0320 11:33:09.193135 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-h8mrm" Mar 20 11:33:09 crc kubenswrapper[4860]: I0320 11:33:09.304951 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2de32a4d-295e-4e53-9224-445137c28938-utilities\") pod \"2de32a4d-295e-4e53-9224-445137c28938\" (UID: \"2de32a4d-295e-4e53-9224-445137c28938\") " Mar 20 11:33:09 crc kubenswrapper[4860]: I0320 11:33:09.305171 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2de32a4d-295e-4e53-9224-445137c28938-catalog-content\") pod \"2de32a4d-295e-4e53-9224-445137c28938\" (UID: \"2de32a4d-295e-4e53-9224-445137c28938\") " Mar 20 11:33:09 crc kubenswrapper[4860]: I0320 11:33:09.305346 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6b796\" (UniqueName: \"kubernetes.io/projected/2de32a4d-295e-4e53-9224-445137c28938-kube-api-access-6b796\") pod \"2de32a4d-295e-4e53-9224-445137c28938\" (UID: \"2de32a4d-295e-4e53-9224-445137c28938\") " Mar 20 11:33:09 crc kubenswrapper[4860]: I0320 11:33:09.308142 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2de32a4d-295e-4e53-9224-445137c28938-utilities" (OuterVolumeSpecName: "utilities") pod "2de32a4d-295e-4e53-9224-445137c28938" (UID: "2de32a4d-295e-4e53-9224-445137c28938"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:33:09 crc kubenswrapper[4860]: I0320 11:33:09.313774 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2de32a4d-295e-4e53-9224-445137c28938-kube-api-access-6b796" (OuterVolumeSpecName: "kube-api-access-6b796") pod "2de32a4d-295e-4e53-9224-445137c28938" (UID: "2de32a4d-295e-4e53-9224-445137c28938"). InnerVolumeSpecName "kube-api-access-6b796". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:33:09 crc kubenswrapper[4860]: I0320 11:33:09.407700 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6b796\" (UniqueName: \"kubernetes.io/projected/2de32a4d-295e-4e53-9224-445137c28938-kube-api-access-6b796\") on node \"crc\" DevicePath \"\"" Mar 20 11:33:09 crc kubenswrapper[4860]: I0320 11:33:09.407746 4860 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2de32a4d-295e-4e53-9224-445137c28938-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:33:09 crc kubenswrapper[4860]: I0320 11:33:09.438784 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2de32a4d-295e-4e53-9224-445137c28938-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2de32a4d-295e-4e53-9224-445137c28938" (UID: "2de32a4d-295e-4e53-9224-445137c28938"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:33:09 crc kubenswrapper[4860]: I0320 11:33:09.509849 4860 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2de32a4d-295e-4e53-9224-445137c28938-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:33:09 crc kubenswrapper[4860]: I0320 11:33:09.806037 4860 generic.go:334] "Generic (PLEG): container finished" podID="2de32a4d-295e-4e53-9224-445137c28938" containerID="c313902cf719e8c3d691f2bb4c13d13ceada33e566d455668762f62c1a440606" exitCode=0 Mar 20 11:33:09 crc kubenswrapper[4860]: I0320 11:33:09.806104 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h8mrm" event={"ID":"2de32a4d-295e-4e53-9224-445137c28938","Type":"ContainerDied","Data":"c313902cf719e8c3d691f2bb4c13d13ceada33e566d455668762f62c1a440606"} Mar 20 11:33:09 crc kubenswrapper[4860]: I0320 11:33:09.806115 4860 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-h8mrm" Mar 20 11:33:09 crc kubenswrapper[4860]: I0320 11:33:09.806149 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h8mrm" event={"ID":"2de32a4d-295e-4e53-9224-445137c28938","Type":"ContainerDied","Data":"84a0d03260848fa62a51d1fe5b7dc42a764bbd619dfd6c2877c6efe12414a0da"} Mar 20 11:33:09 crc kubenswrapper[4860]: I0320 11:33:09.806179 4860 scope.go:117] "RemoveContainer" containerID="c313902cf719e8c3d691f2bb4c13d13ceada33e566d455668762f62c1a440606" Mar 20 11:33:09 crc kubenswrapper[4860]: I0320 11:33:09.829805 4860 scope.go:117] "RemoveContainer" containerID="e840894dd4faef1c4efdefff7e91fad1e5c6f6dbc5fd5e68fc6a294c9ad64bf2" Mar 20 11:33:09 crc kubenswrapper[4860]: I0320 11:33:09.849216 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-h8mrm"] Mar 20 11:33:09 crc kubenswrapper[4860]: I0320 11:33:09.854823 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-h8mrm"] Mar 20 11:33:09 crc kubenswrapper[4860]: I0320 11:33:09.877345 4860 scope.go:117] "RemoveContainer" containerID="e2395973e064152259253eb58c0c6a0446f283f6f52bc2a04815836f797ae9be" Mar 20 11:33:09 crc kubenswrapper[4860]: I0320 11:33:09.897381 4860 scope.go:117] "RemoveContainer" containerID="c313902cf719e8c3d691f2bb4c13d13ceada33e566d455668762f62c1a440606" Mar 20 11:33:09 crc kubenswrapper[4860]: E0320 11:33:09.898109 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c313902cf719e8c3d691f2bb4c13d13ceada33e566d455668762f62c1a440606\": container with ID starting with c313902cf719e8c3d691f2bb4c13d13ceada33e566d455668762f62c1a440606 not found: ID does not exist" containerID="c313902cf719e8c3d691f2bb4c13d13ceada33e566d455668762f62c1a440606" Mar 20 11:33:09 crc kubenswrapper[4860]: I0320 11:33:09.898150 4860 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c313902cf719e8c3d691f2bb4c13d13ceada33e566d455668762f62c1a440606"} err="failed to get container status \"c313902cf719e8c3d691f2bb4c13d13ceada33e566d455668762f62c1a440606\": rpc error: code = NotFound desc = could not find container \"c313902cf719e8c3d691f2bb4c13d13ceada33e566d455668762f62c1a440606\": container with ID starting with c313902cf719e8c3d691f2bb4c13d13ceada33e566d455668762f62c1a440606 not found: ID does not exist" Mar 20 11:33:09 crc kubenswrapper[4860]: I0320 11:33:09.898182 4860 scope.go:117] "RemoveContainer" containerID="e840894dd4faef1c4efdefff7e91fad1e5c6f6dbc5fd5e68fc6a294c9ad64bf2" Mar 20 11:33:09 crc kubenswrapper[4860]: E0320 11:33:09.898891 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e840894dd4faef1c4efdefff7e91fad1e5c6f6dbc5fd5e68fc6a294c9ad64bf2\": container with ID starting with e840894dd4faef1c4efdefff7e91fad1e5c6f6dbc5fd5e68fc6a294c9ad64bf2 not found: ID does not exist" containerID="e840894dd4faef1c4efdefff7e91fad1e5c6f6dbc5fd5e68fc6a294c9ad64bf2" Mar 20 11:33:09 crc kubenswrapper[4860]: I0320 11:33:09.898946 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e840894dd4faef1c4efdefff7e91fad1e5c6f6dbc5fd5e68fc6a294c9ad64bf2"} err="failed to get container status \"e840894dd4faef1c4efdefff7e91fad1e5c6f6dbc5fd5e68fc6a294c9ad64bf2\": rpc error: code = NotFound desc = could not find container \"e840894dd4faef1c4efdefff7e91fad1e5c6f6dbc5fd5e68fc6a294c9ad64bf2\": container with ID starting with e840894dd4faef1c4efdefff7e91fad1e5c6f6dbc5fd5e68fc6a294c9ad64bf2 not found: ID does not exist" Mar 20 11:33:09 crc kubenswrapper[4860]: I0320 11:33:09.898986 4860 scope.go:117] "RemoveContainer" containerID="e2395973e064152259253eb58c0c6a0446f283f6f52bc2a04815836f797ae9be" Mar 20 11:33:09 crc kubenswrapper[4860]: E0320 
11:33:09.899733 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2395973e064152259253eb58c0c6a0446f283f6f52bc2a04815836f797ae9be\": container with ID starting with e2395973e064152259253eb58c0c6a0446f283f6f52bc2a04815836f797ae9be not found: ID does not exist" containerID="e2395973e064152259253eb58c0c6a0446f283f6f52bc2a04815836f797ae9be" Mar 20 11:33:09 crc kubenswrapper[4860]: I0320 11:33:09.899806 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2395973e064152259253eb58c0c6a0446f283f6f52bc2a04815836f797ae9be"} err="failed to get container status \"e2395973e064152259253eb58c0c6a0446f283f6f52bc2a04815836f797ae9be\": rpc error: code = NotFound desc = could not find container \"e2395973e064152259253eb58c0c6a0446f283f6f52bc2a04815836f797ae9be\": container with ID starting with e2395973e064152259253eb58c0c6a0446f283f6f52bc2a04815836f797ae9be not found: ID does not exist" Mar 20 11:33:11 crc kubenswrapper[4860]: I0320 11:33:11.422955 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2de32a4d-295e-4e53-9224-445137c28938" path="/var/lib/kubelet/pods/2de32a4d-295e-4e53-9224-445137c28938/volumes" Mar 20 11:33:19 crc kubenswrapper[4860]: I0320 11:33:19.587386 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xphpt"] Mar 20 11:33:19 crc kubenswrapper[4860]: E0320 11:33:19.588712 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2de32a4d-295e-4e53-9224-445137c28938" containerName="extract-content" Mar 20 11:33:19 crc kubenswrapper[4860]: I0320 11:33:19.588734 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="2de32a4d-295e-4e53-9224-445137c28938" containerName="extract-content" Mar 20 11:33:19 crc kubenswrapper[4860]: E0320 11:33:19.588764 4860 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2de32a4d-295e-4e53-9224-445137c28938" containerName="extract-utilities" Mar 20 11:33:19 crc kubenswrapper[4860]: I0320 11:33:19.588770 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="2de32a4d-295e-4e53-9224-445137c28938" containerName="extract-utilities" Mar 20 11:33:19 crc kubenswrapper[4860]: E0320 11:33:19.588787 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2de32a4d-295e-4e53-9224-445137c28938" containerName="registry-server" Mar 20 11:33:19 crc kubenswrapper[4860]: I0320 11:33:19.588792 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="2de32a4d-295e-4e53-9224-445137c28938" containerName="registry-server" Mar 20 11:33:19 crc kubenswrapper[4860]: I0320 11:33:19.588927 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="2de32a4d-295e-4e53-9224-445137c28938" containerName="registry-server" Mar 20 11:33:19 crc kubenswrapper[4860]: I0320 11:33:19.590169 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xphpt" Mar 20 11:33:19 crc kubenswrapper[4860]: I0320 11:33:19.627635 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xphpt"] Mar 20 11:33:19 crc kubenswrapper[4860]: I0320 11:33:19.676523 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca22abec-1b58-4b4d-a3a8-0744e4684074-catalog-content\") pod \"certified-operators-xphpt\" (UID: \"ca22abec-1b58-4b4d-a3a8-0744e4684074\") " pod="openshift-marketplace/certified-operators-xphpt" Mar 20 11:33:19 crc kubenswrapper[4860]: I0320 11:33:19.677374 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlv8l\" (UniqueName: \"kubernetes.io/projected/ca22abec-1b58-4b4d-a3a8-0744e4684074-kube-api-access-hlv8l\") pod \"certified-operators-xphpt\" (UID: 
\"ca22abec-1b58-4b4d-a3a8-0744e4684074\") " pod="openshift-marketplace/certified-operators-xphpt" Mar 20 11:33:19 crc kubenswrapper[4860]: I0320 11:33:19.677494 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca22abec-1b58-4b4d-a3a8-0744e4684074-utilities\") pod \"certified-operators-xphpt\" (UID: \"ca22abec-1b58-4b4d-a3a8-0744e4684074\") " pod="openshift-marketplace/certified-operators-xphpt" Mar 20 11:33:19 crc kubenswrapper[4860]: I0320 11:33:19.779053 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlv8l\" (UniqueName: \"kubernetes.io/projected/ca22abec-1b58-4b4d-a3a8-0744e4684074-kube-api-access-hlv8l\") pod \"certified-operators-xphpt\" (UID: \"ca22abec-1b58-4b4d-a3a8-0744e4684074\") " pod="openshift-marketplace/certified-operators-xphpt" Mar 20 11:33:19 crc kubenswrapper[4860]: I0320 11:33:19.779102 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca22abec-1b58-4b4d-a3a8-0744e4684074-utilities\") pod \"certified-operators-xphpt\" (UID: \"ca22abec-1b58-4b4d-a3a8-0744e4684074\") " pod="openshift-marketplace/certified-operators-xphpt" Mar 20 11:33:19 crc kubenswrapper[4860]: I0320 11:33:19.779183 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca22abec-1b58-4b4d-a3a8-0744e4684074-catalog-content\") pod \"certified-operators-xphpt\" (UID: \"ca22abec-1b58-4b4d-a3a8-0744e4684074\") " pod="openshift-marketplace/certified-operators-xphpt" Mar 20 11:33:19 crc kubenswrapper[4860]: I0320 11:33:19.779693 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca22abec-1b58-4b4d-a3a8-0744e4684074-catalog-content\") pod \"certified-operators-xphpt\" (UID: 
\"ca22abec-1b58-4b4d-a3a8-0744e4684074\") " pod="openshift-marketplace/certified-operators-xphpt" Mar 20 11:33:19 crc kubenswrapper[4860]: I0320 11:33:19.780166 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca22abec-1b58-4b4d-a3a8-0744e4684074-utilities\") pod \"certified-operators-xphpt\" (UID: \"ca22abec-1b58-4b4d-a3a8-0744e4684074\") " pod="openshift-marketplace/certified-operators-xphpt" Mar 20 11:33:19 crc kubenswrapper[4860]: I0320 11:33:19.801913 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlv8l\" (UniqueName: \"kubernetes.io/projected/ca22abec-1b58-4b4d-a3a8-0744e4684074-kube-api-access-hlv8l\") pod \"certified-operators-xphpt\" (UID: \"ca22abec-1b58-4b4d-a3a8-0744e4684074\") " pod="openshift-marketplace/certified-operators-xphpt" Mar 20 11:33:19 crc kubenswrapper[4860]: I0320 11:33:19.918539 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xphpt" Mar 20 11:33:20 crc kubenswrapper[4860]: I0320 11:33:20.457933 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xphpt"] Mar 20 11:33:20 crc kubenswrapper[4860]: I0320 11:33:20.889105 4860 generic.go:334] "Generic (PLEG): container finished" podID="ca22abec-1b58-4b4d-a3a8-0744e4684074" containerID="7b2b611de9e8ecef269217ade01c2b26e99a9b22cd2ba20b7dc2cc24557abe0e" exitCode=0 Mar 20 11:33:20 crc kubenswrapper[4860]: I0320 11:33:20.890257 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xphpt" event={"ID":"ca22abec-1b58-4b4d-a3a8-0744e4684074","Type":"ContainerDied","Data":"7b2b611de9e8ecef269217ade01c2b26e99a9b22cd2ba20b7dc2cc24557abe0e"} Mar 20 11:33:20 crc kubenswrapper[4860]: I0320 11:33:20.890413 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xphpt" 
event={"ID":"ca22abec-1b58-4b4d-a3a8-0744e4684074","Type":"ContainerStarted","Data":"983ed4b1339ec96d4ba41f7871568d90c8c49c98f3f71ee320948724affbf5ff"} Mar 20 11:33:20 crc kubenswrapper[4860]: I0320 11:33:20.891640 4860 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 11:33:21 crc kubenswrapper[4860]: I0320 11:33:21.898200 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xphpt" event={"ID":"ca22abec-1b58-4b4d-a3a8-0744e4684074","Type":"ContainerStarted","Data":"6f9b3d8db31b52d3c68c31c34984b6ae9a0c8f71daa6f9d5f0f826f06de3fd53"} Mar 20 11:33:22 crc kubenswrapper[4860]: I0320 11:33:22.413358 4860 scope.go:117] "RemoveContainer" containerID="174cf0258fb93c52c964e89ccb7ea77d9cb0e4fb77f37a49b81be365390d1f67" Mar 20 11:33:22 crc kubenswrapper[4860]: E0320 11:33:22.414188 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:33:22 crc kubenswrapper[4860]: I0320 11:33:22.907271 4860 generic.go:334] "Generic (PLEG): container finished" podID="ca22abec-1b58-4b4d-a3a8-0744e4684074" containerID="6f9b3d8db31b52d3c68c31c34984b6ae9a0c8f71daa6f9d5f0f826f06de3fd53" exitCode=0 Mar 20 11:33:22 crc kubenswrapper[4860]: I0320 11:33:22.907340 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xphpt" event={"ID":"ca22abec-1b58-4b4d-a3a8-0744e4684074","Type":"ContainerDied","Data":"6f9b3d8db31b52d3c68c31c34984b6ae9a0c8f71daa6f9d5f0f826f06de3fd53"} Mar 20 11:33:23 crc kubenswrapper[4860]: I0320 11:33:23.916426 4860 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-xphpt" event={"ID":"ca22abec-1b58-4b4d-a3a8-0744e4684074","Type":"ContainerStarted","Data":"fb68db13dd5693e8a8378284b4da077c66560aeac4b3effe34ac00faa66c6a08"} Mar 20 11:33:23 crc kubenswrapper[4860]: I0320 11:33:23.944540 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xphpt" podStartSLOduration=2.486026149 podStartE2EDuration="4.944515409s" podCreationTimestamp="2026-03-20 11:33:19 +0000 UTC" firstStartedPulling="2026-03-20 11:33:20.891374548 +0000 UTC m=+2325.112735446" lastFinishedPulling="2026-03-20 11:33:23.349863798 +0000 UTC m=+2327.571224706" observedRunningTime="2026-03-20 11:33:23.938904678 +0000 UTC m=+2328.160265586" watchObservedRunningTime="2026-03-20 11:33:23.944515409 +0000 UTC m=+2328.165876307" Mar 20 11:33:29 crc kubenswrapper[4860]: I0320 11:33:29.919927 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xphpt" Mar 20 11:33:29 crc kubenswrapper[4860]: I0320 11:33:29.921064 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xphpt" Mar 20 11:33:29 crc kubenswrapper[4860]: I0320 11:33:29.978212 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xphpt" Mar 20 11:33:30 crc kubenswrapper[4860]: I0320 11:33:30.036718 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xphpt" Mar 20 11:33:32 crc kubenswrapper[4860]: I0320 11:33:32.036804 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2qmpt"] Mar 20 11:33:32 crc kubenswrapper[4860]: I0320 11:33:32.040134 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2qmpt" Mar 20 11:33:32 crc kubenswrapper[4860]: I0320 11:33:32.055618 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2qmpt"] Mar 20 11:33:32 crc kubenswrapper[4860]: I0320 11:33:32.073582 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9a43112-1781-421d-9123-971f77f6739e-catalog-content\") pod \"redhat-marketplace-2qmpt\" (UID: \"b9a43112-1781-421d-9123-971f77f6739e\") " pod="openshift-marketplace/redhat-marketplace-2qmpt" Mar 20 11:33:32 crc kubenswrapper[4860]: I0320 11:33:32.073680 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qqkt\" (UniqueName: \"kubernetes.io/projected/b9a43112-1781-421d-9123-971f77f6739e-kube-api-access-4qqkt\") pod \"redhat-marketplace-2qmpt\" (UID: \"b9a43112-1781-421d-9123-971f77f6739e\") " pod="openshift-marketplace/redhat-marketplace-2qmpt" Mar 20 11:33:32 crc kubenswrapper[4860]: I0320 11:33:32.073736 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9a43112-1781-421d-9123-971f77f6739e-utilities\") pod \"redhat-marketplace-2qmpt\" (UID: \"b9a43112-1781-421d-9123-971f77f6739e\") " pod="openshift-marketplace/redhat-marketplace-2qmpt" Mar 20 11:33:32 crc kubenswrapper[4860]: I0320 11:33:32.175664 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qqkt\" (UniqueName: \"kubernetes.io/projected/b9a43112-1781-421d-9123-971f77f6739e-kube-api-access-4qqkt\") pod \"redhat-marketplace-2qmpt\" (UID: \"b9a43112-1781-421d-9123-971f77f6739e\") " pod="openshift-marketplace/redhat-marketplace-2qmpt" Mar 20 11:33:32 crc kubenswrapper[4860]: I0320 11:33:32.175725 4860 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9a43112-1781-421d-9123-971f77f6739e-utilities\") pod \"redhat-marketplace-2qmpt\" (UID: \"b9a43112-1781-421d-9123-971f77f6739e\") " pod="openshift-marketplace/redhat-marketplace-2qmpt" Mar 20 11:33:32 crc kubenswrapper[4860]: I0320 11:33:32.175788 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9a43112-1781-421d-9123-971f77f6739e-catalog-content\") pod \"redhat-marketplace-2qmpt\" (UID: \"b9a43112-1781-421d-9123-971f77f6739e\") " pod="openshift-marketplace/redhat-marketplace-2qmpt" Mar 20 11:33:32 crc kubenswrapper[4860]: I0320 11:33:32.176412 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9a43112-1781-421d-9123-971f77f6739e-catalog-content\") pod \"redhat-marketplace-2qmpt\" (UID: \"b9a43112-1781-421d-9123-971f77f6739e\") " pod="openshift-marketplace/redhat-marketplace-2qmpt" Mar 20 11:33:32 crc kubenswrapper[4860]: I0320 11:33:32.176638 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9a43112-1781-421d-9123-971f77f6739e-utilities\") pod \"redhat-marketplace-2qmpt\" (UID: \"b9a43112-1781-421d-9123-971f77f6739e\") " pod="openshift-marketplace/redhat-marketplace-2qmpt" Mar 20 11:33:32 crc kubenswrapper[4860]: I0320 11:33:32.206325 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qqkt\" (UniqueName: \"kubernetes.io/projected/b9a43112-1781-421d-9123-971f77f6739e-kube-api-access-4qqkt\") pod \"redhat-marketplace-2qmpt\" (UID: \"b9a43112-1781-421d-9123-971f77f6739e\") " pod="openshift-marketplace/redhat-marketplace-2qmpt" Mar 20 11:33:32 crc kubenswrapper[4860]: I0320 11:33:32.382930 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2qmpt" Mar 20 11:33:32 crc kubenswrapper[4860]: I0320 11:33:32.839483 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2qmpt"] Mar 20 11:33:32 crc kubenswrapper[4860]: I0320 11:33:32.988440 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2qmpt" event={"ID":"b9a43112-1781-421d-9123-971f77f6739e","Type":"ContainerStarted","Data":"ace6e6791128e2eaff68c3272eb50bcf3fe0a2d8d4ae2e160cf51cf12052e981"} Mar 20 11:33:33 crc kubenswrapper[4860]: I0320 11:33:33.997642 4860 generic.go:334] "Generic (PLEG): container finished" podID="b9a43112-1781-421d-9123-971f77f6739e" containerID="eb7ee69cc011a4baac33cf78d514007cd666797e1e2e361a8be5ebf8b7e4c5e8" exitCode=0 Mar 20 11:33:33 crc kubenswrapper[4860]: I0320 11:33:33.998115 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2qmpt" event={"ID":"b9a43112-1781-421d-9123-971f77f6739e","Type":"ContainerDied","Data":"eb7ee69cc011a4baac33cf78d514007cd666797e1e2e361a8be5ebf8b7e4c5e8"} Mar 20 11:33:35 crc kubenswrapper[4860]: I0320 11:33:35.008944 4860 generic.go:334] "Generic (PLEG): container finished" podID="b9a43112-1781-421d-9123-971f77f6739e" containerID="e11794d01c2a317e0d5de1c29a1335434b1ac423e812be9a342abf8366eddb63" exitCode=0 Mar 20 11:33:35 crc kubenswrapper[4860]: I0320 11:33:35.009016 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2qmpt" event={"ID":"b9a43112-1781-421d-9123-971f77f6739e","Type":"ContainerDied","Data":"e11794d01c2a317e0d5de1c29a1335434b1ac423e812be9a342abf8366eddb63"} Mar 20 11:33:35 crc kubenswrapper[4860]: I0320 11:33:35.224101 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xphpt"] Mar 20 11:33:35 crc kubenswrapper[4860]: I0320 11:33:35.224419 4860 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-marketplace/certified-operators-xphpt" podUID="ca22abec-1b58-4b4d-a3a8-0744e4684074" containerName="registry-server" containerID="cri-o://fb68db13dd5693e8a8378284b4da077c66560aeac4b3effe34ac00faa66c6a08" gracePeriod=2 Mar 20 11:33:35 crc kubenswrapper[4860]: I0320 11:33:35.681866 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xphpt" Mar 20 11:33:35 crc kubenswrapper[4860]: I0320 11:33:35.740493 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca22abec-1b58-4b4d-a3a8-0744e4684074-utilities\") pod \"ca22abec-1b58-4b4d-a3a8-0744e4684074\" (UID: \"ca22abec-1b58-4b4d-a3a8-0744e4684074\") " Mar 20 11:33:35 crc kubenswrapper[4860]: I0320 11:33:35.740571 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hlv8l\" (UniqueName: \"kubernetes.io/projected/ca22abec-1b58-4b4d-a3a8-0744e4684074-kube-api-access-hlv8l\") pod \"ca22abec-1b58-4b4d-a3a8-0744e4684074\" (UID: \"ca22abec-1b58-4b4d-a3a8-0744e4684074\") " Mar 20 11:33:35 crc kubenswrapper[4860]: I0320 11:33:35.740839 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca22abec-1b58-4b4d-a3a8-0744e4684074-catalog-content\") pod \"ca22abec-1b58-4b4d-a3a8-0744e4684074\" (UID: \"ca22abec-1b58-4b4d-a3a8-0744e4684074\") " Mar 20 11:33:35 crc kubenswrapper[4860]: I0320 11:33:35.741600 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca22abec-1b58-4b4d-a3a8-0744e4684074-utilities" (OuterVolumeSpecName: "utilities") pod "ca22abec-1b58-4b4d-a3a8-0744e4684074" (UID: "ca22abec-1b58-4b4d-a3a8-0744e4684074"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:33:35 crc kubenswrapper[4860]: I0320 11:33:35.748271 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca22abec-1b58-4b4d-a3a8-0744e4684074-kube-api-access-hlv8l" (OuterVolumeSpecName: "kube-api-access-hlv8l") pod "ca22abec-1b58-4b4d-a3a8-0744e4684074" (UID: "ca22abec-1b58-4b4d-a3a8-0744e4684074"). InnerVolumeSpecName "kube-api-access-hlv8l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:33:35 crc kubenswrapper[4860]: I0320 11:33:35.804982 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca22abec-1b58-4b4d-a3a8-0744e4684074-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ca22abec-1b58-4b4d-a3a8-0744e4684074" (UID: "ca22abec-1b58-4b4d-a3a8-0744e4684074"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:33:35 crc kubenswrapper[4860]: I0320 11:33:35.843473 4860 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca22abec-1b58-4b4d-a3a8-0744e4684074-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:33:35 crc kubenswrapper[4860]: I0320 11:33:35.843526 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hlv8l\" (UniqueName: \"kubernetes.io/projected/ca22abec-1b58-4b4d-a3a8-0744e4684074-kube-api-access-hlv8l\") on node \"crc\" DevicePath \"\"" Mar 20 11:33:35 crc kubenswrapper[4860]: I0320 11:33:35.843540 4860 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca22abec-1b58-4b4d-a3a8-0744e4684074-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:33:36 crc kubenswrapper[4860]: I0320 11:33:36.021299 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2qmpt" 
event={"ID":"b9a43112-1781-421d-9123-971f77f6739e","Type":"ContainerStarted","Data":"e1469ca050f7c90afc00f35e26c3db8fafa10d139777ea4fb8d3af6cacc36d2c"} Mar 20 11:33:36 crc kubenswrapper[4860]: I0320 11:33:36.025000 4860 generic.go:334] "Generic (PLEG): container finished" podID="ca22abec-1b58-4b4d-a3a8-0744e4684074" containerID="fb68db13dd5693e8a8378284b4da077c66560aeac4b3effe34ac00faa66c6a08" exitCode=0 Mar 20 11:33:36 crc kubenswrapper[4860]: I0320 11:33:36.025093 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xphpt" event={"ID":"ca22abec-1b58-4b4d-a3a8-0744e4684074","Type":"ContainerDied","Data":"fb68db13dd5693e8a8378284b4da077c66560aeac4b3effe34ac00faa66c6a08"} Mar 20 11:33:36 crc kubenswrapper[4860]: I0320 11:33:36.025168 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xphpt" event={"ID":"ca22abec-1b58-4b4d-a3a8-0744e4684074","Type":"ContainerDied","Data":"983ed4b1339ec96d4ba41f7871568d90c8c49c98f3f71ee320948724affbf5ff"} Mar 20 11:33:36 crc kubenswrapper[4860]: I0320 11:33:36.025195 4860 scope.go:117] "RemoveContainer" containerID="fb68db13dd5693e8a8378284b4da077c66560aeac4b3effe34ac00faa66c6a08" Mar 20 11:33:36 crc kubenswrapper[4860]: I0320 11:33:36.025121 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xphpt" Mar 20 11:33:36 crc kubenswrapper[4860]: I0320 11:33:36.053569 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2qmpt" podStartSLOduration=2.445158218 podStartE2EDuration="4.053548217s" podCreationTimestamp="2026-03-20 11:33:32 +0000 UTC" firstStartedPulling="2026-03-20 11:33:34.002582301 +0000 UTC m=+2338.223943199" lastFinishedPulling="2026-03-20 11:33:35.6109723 +0000 UTC m=+2339.832333198" observedRunningTime="2026-03-20 11:33:36.048645525 +0000 UTC m=+2340.270006423" watchObservedRunningTime="2026-03-20 11:33:36.053548217 +0000 UTC m=+2340.274909115" Mar 20 11:33:36 crc kubenswrapper[4860]: I0320 11:33:36.055427 4860 scope.go:117] "RemoveContainer" containerID="6f9b3d8db31b52d3c68c31c34984b6ae9a0c8f71daa6f9d5f0f826f06de3fd53" Mar 20 11:33:36 crc kubenswrapper[4860]: I0320 11:33:36.072207 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xphpt"] Mar 20 11:33:36 crc kubenswrapper[4860]: I0320 11:33:36.081572 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xphpt"] Mar 20 11:33:36 crc kubenswrapper[4860]: I0320 11:33:36.109588 4860 scope.go:117] "RemoveContainer" containerID="7b2b611de9e8ecef269217ade01c2b26e99a9b22cd2ba20b7dc2cc24557abe0e" Mar 20 11:33:36 crc kubenswrapper[4860]: I0320 11:33:36.132550 4860 scope.go:117] "RemoveContainer" containerID="fb68db13dd5693e8a8378284b4da077c66560aeac4b3effe34ac00faa66c6a08" Mar 20 11:33:36 crc kubenswrapper[4860]: E0320 11:33:36.133140 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb68db13dd5693e8a8378284b4da077c66560aeac4b3effe34ac00faa66c6a08\": container with ID starting with fb68db13dd5693e8a8378284b4da077c66560aeac4b3effe34ac00faa66c6a08 not found: ID does not exist" 
containerID="fb68db13dd5693e8a8378284b4da077c66560aeac4b3effe34ac00faa66c6a08" Mar 20 11:33:36 crc kubenswrapper[4860]: I0320 11:33:36.133194 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb68db13dd5693e8a8378284b4da077c66560aeac4b3effe34ac00faa66c6a08"} err="failed to get container status \"fb68db13dd5693e8a8378284b4da077c66560aeac4b3effe34ac00faa66c6a08\": rpc error: code = NotFound desc = could not find container \"fb68db13dd5693e8a8378284b4da077c66560aeac4b3effe34ac00faa66c6a08\": container with ID starting with fb68db13dd5693e8a8378284b4da077c66560aeac4b3effe34ac00faa66c6a08 not found: ID does not exist" Mar 20 11:33:36 crc kubenswrapper[4860]: I0320 11:33:36.133251 4860 scope.go:117] "RemoveContainer" containerID="6f9b3d8db31b52d3c68c31c34984b6ae9a0c8f71daa6f9d5f0f826f06de3fd53" Mar 20 11:33:36 crc kubenswrapper[4860]: E0320 11:33:36.135392 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f9b3d8db31b52d3c68c31c34984b6ae9a0c8f71daa6f9d5f0f826f06de3fd53\": container with ID starting with 6f9b3d8db31b52d3c68c31c34984b6ae9a0c8f71daa6f9d5f0f826f06de3fd53 not found: ID does not exist" containerID="6f9b3d8db31b52d3c68c31c34984b6ae9a0c8f71daa6f9d5f0f826f06de3fd53" Mar 20 11:33:36 crc kubenswrapper[4860]: I0320 11:33:36.135429 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f9b3d8db31b52d3c68c31c34984b6ae9a0c8f71daa6f9d5f0f826f06de3fd53"} err="failed to get container status \"6f9b3d8db31b52d3c68c31c34984b6ae9a0c8f71daa6f9d5f0f826f06de3fd53\": rpc error: code = NotFound desc = could not find container \"6f9b3d8db31b52d3c68c31c34984b6ae9a0c8f71daa6f9d5f0f826f06de3fd53\": container with ID starting with 6f9b3d8db31b52d3c68c31c34984b6ae9a0c8f71daa6f9d5f0f826f06de3fd53 not found: ID does not exist" Mar 20 11:33:36 crc kubenswrapper[4860]: I0320 11:33:36.135458 4860 scope.go:117] 
"RemoveContainer" containerID="7b2b611de9e8ecef269217ade01c2b26e99a9b22cd2ba20b7dc2cc24557abe0e" Mar 20 11:33:36 crc kubenswrapper[4860]: E0320 11:33:36.135996 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b2b611de9e8ecef269217ade01c2b26e99a9b22cd2ba20b7dc2cc24557abe0e\": container with ID starting with 7b2b611de9e8ecef269217ade01c2b26e99a9b22cd2ba20b7dc2cc24557abe0e not found: ID does not exist" containerID="7b2b611de9e8ecef269217ade01c2b26e99a9b22cd2ba20b7dc2cc24557abe0e" Mar 20 11:33:36 crc kubenswrapper[4860]: I0320 11:33:36.136023 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b2b611de9e8ecef269217ade01c2b26e99a9b22cd2ba20b7dc2cc24557abe0e"} err="failed to get container status \"7b2b611de9e8ecef269217ade01c2b26e99a9b22cd2ba20b7dc2cc24557abe0e\": rpc error: code = NotFound desc = could not find container \"7b2b611de9e8ecef269217ade01c2b26e99a9b22cd2ba20b7dc2cc24557abe0e\": container with ID starting with 7b2b611de9e8ecef269217ade01c2b26e99a9b22cd2ba20b7dc2cc24557abe0e not found: ID does not exist" Mar 20 11:33:37 crc kubenswrapper[4860]: I0320 11:33:37.418633 4860 scope.go:117] "RemoveContainer" containerID="174cf0258fb93c52c964e89ccb7ea77d9cb0e4fb77f37a49b81be365390d1f67" Mar 20 11:33:37 crc kubenswrapper[4860]: E0320 11:33:37.419513 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:33:37 crc kubenswrapper[4860]: I0320 11:33:37.424205 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="ca22abec-1b58-4b4d-a3a8-0744e4684074" path="/var/lib/kubelet/pods/ca22abec-1b58-4b4d-a3a8-0744e4684074/volumes" Mar 20 11:33:42 crc kubenswrapper[4860]: I0320 11:33:42.383398 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2qmpt" Mar 20 11:33:42 crc kubenswrapper[4860]: I0320 11:33:42.385416 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2qmpt" Mar 20 11:33:42 crc kubenswrapper[4860]: I0320 11:33:42.448499 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2qmpt" Mar 20 11:33:43 crc kubenswrapper[4860]: I0320 11:33:43.138832 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2qmpt" Mar 20 11:33:43 crc kubenswrapper[4860]: I0320 11:33:43.197539 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2qmpt"] Mar 20 11:33:45 crc kubenswrapper[4860]: I0320 11:33:45.103530 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2qmpt" podUID="b9a43112-1781-421d-9123-971f77f6739e" containerName="registry-server" containerID="cri-o://e1469ca050f7c90afc00f35e26c3db8fafa10d139777ea4fb8d3af6cacc36d2c" gracePeriod=2 Mar 20 11:33:46 crc kubenswrapper[4860]: I0320 11:33:46.737006 4860 generic.go:334] "Generic (PLEG): container finished" podID="b9a43112-1781-421d-9123-971f77f6739e" containerID="e1469ca050f7c90afc00f35e26c3db8fafa10d139777ea4fb8d3af6cacc36d2c" exitCode=0 Mar 20 11:33:46 crc kubenswrapper[4860]: I0320 11:33:46.737492 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2qmpt" event={"ID":"b9a43112-1781-421d-9123-971f77f6739e","Type":"ContainerDied","Data":"e1469ca050f7c90afc00f35e26c3db8fafa10d139777ea4fb8d3af6cacc36d2c"} Mar 
20 11:33:47 crc kubenswrapper[4860]: I0320 11:33:47.010301 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2qmpt" Mar 20 11:33:47 crc kubenswrapper[4860]: I0320 11:33:47.189446 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9a43112-1781-421d-9123-971f77f6739e-utilities\") pod \"b9a43112-1781-421d-9123-971f77f6739e\" (UID: \"b9a43112-1781-421d-9123-971f77f6739e\") " Mar 20 11:33:47 crc kubenswrapper[4860]: I0320 11:33:47.189685 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9a43112-1781-421d-9123-971f77f6739e-catalog-content\") pod \"b9a43112-1781-421d-9123-971f77f6739e\" (UID: \"b9a43112-1781-421d-9123-971f77f6739e\") " Mar 20 11:33:47 crc kubenswrapper[4860]: I0320 11:33:47.189727 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qqkt\" (UniqueName: \"kubernetes.io/projected/b9a43112-1781-421d-9123-971f77f6739e-kube-api-access-4qqkt\") pod \"b9a43112-1781-421d-9123-971f77f6739e\" (UID: \"b9a43112-1781-421d-9123-971f77f6739e\") " Mar 20 11:33:47 crc kubenswrapper[4860]: I0320 11:33:47.191256 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9a43112-1781-421d-9123-971f77f6739e-utilities" (OuterVolumeSpecName: "utilities") pod "b9a43112-1781-421d-9123-971f77f6739e" (UID: "b9a43112-1781-421d-9123-971f77f6739e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:33:47 crc kubenswrapper[4860]: I0320 11:33:47.202065 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9a43112-1781-421d-9123-971f77f6739e-kube-api-access-4qqkt" (OuterVolumeSpecName: "kube-api-access-4qqkt") pod "b9a43112-1781-421d-9123-971f77f6739e" (UID: "b9a43112-1781-421d-9123-971f77f6739e"). InnerVolumeSpecName "kube-api-access-4qqkt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:33:47 crc kubenswrapper[4860]: I0320 11:33:47.231443 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9a43112-1781-421d-9123-971f77f6739e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b9a43112-1781-421d-9123-971f77f6739e" (UID: "b9a43112-1781-421d-9123-971f77f6739e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:33:47 crc kubenswrapper[4860]: I0320 11:33:47.292116 4860 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9a43112-1781-421d-9123-971f77f6739e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:33:47 crc kubenswrapper[4860]: I0320 11:33:47.292152 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4qqkt\" (UniqueName: \"kubernetes.io/projected/b9a43112-1781-421d-9123-971f77f6739e-kube-api-access-4qqkt\") on node \"crc\" DevicePath \"\"" Mar 20 11:33:47 crc kubenswrapper[4860]: I0320 11:33:47.292165 4860 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9a43112-1781-421d-9123-971f77f6739e-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:33:47 crc kubenswrapper[4860]: I0320 11:33:47.753124 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2qmpt" 
event={"ID":"b9a43112-1781-421d-9123-971f77f6739e","Type":"ContainerDied","Data":"ace6e6791128e2eaff68c3272eb50bcf3fe0a2d8d4ae2e160cf51cf12052e981"} Mar 20 11:33:47 crc kubenswrapper[4860]: I0320 11:33:47.753240 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2qmpt" Mar 20 11:33:47 crc kubenswrapper[4860]: I0320 11:33:47.753595 4860 scope.go:117] "RemoveContainer" containerID="e1469ca050f7c90afc00f35e26c3db8fafa10d139777ea4fb8d3af6cacc36d2c" Mar 20 11:33:47 crc kubenswrapper[4860]: I0320 11:33:47.780806 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2qmpt"] Mar 20 11:33:47 crc kubenswrapper[4860]: I0320 11:33:47.789631 4860 scope.go:117] "RemoveContainer" containerID="e11794d01c2a317e0d5de1c29a1335434b1ac423e812be9a342abf8366eddb63" Mar 20 11:33:47 crc kubenswrapper[4860]: I0320 11:33:47.794362 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2qmpt"] Mar 20 11:33:47 crc kubenswrapper[4860]: I0320 11:33:47.811494 4860 scope.go:117] "RemoveContainer" containerID="eb7ee69cc011a4baac33cf78d514007cd666797e1e2e361a8be5ebf8b7e4c5e8" Mar 20 11:33:49 crc kubenswrapper[4860]: I0320 11:33:49.416154 4860 scope.go:117] "RemoveContainer" containerID="174cf0258fb93c52c964e89ccb7ea77d9cb0e4fb77f37a49b81be365390d1f67" Mar 20 11:33:49 crc kubenswrapper[4860]: E0320 11:33:49.416876 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:33:49 crc kubenswrapper[4860]: I0320 11:33:49.427912 4860 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="b9a43112-1781-421d-9123-971f77f6739e" path="/var/lib/kubelet/pods/b9a43112-1781-421d-9123-971f77f6739e/volumes" Mar 20 11:34:00 crc kubenswrapper[4860]: I0320 11:34:00.148797 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566774-jmb8l"] Mar 20 11:34:00 crc kubenswrapper[4860]: E0320 11:34:00.149823 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9a43112-1781-421d-9123-971f77f6739e" containerName="extract-content" Mar 20 11:34:00 crc kubenswrapper[4860]: I0320 11:34:00.149840 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9a43112-1781-421d-9123-971f77f6739e" containerName="extract-content" Mar 20 11:34:00 crc kubenswrapper[4860]: E0320 11:34:00.149858 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca22abec-1b58-4b4d-a3a8-0744e4684074" containerName="extract-content" Mar 20 11:34:00 crc kubenswrapper[4860]: I0320 11:34:00.149865 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca22abec-1b58-4b4d-a3a8-0744e4684074" containerName="extract-content" Mar 20 11:34:00 crc kubenswrapper[4860]: E0320 11:34:00.149887 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9a43112-1781-421d-9123-971f77f6739e" containerName="registry-server" Mar 20 11:34:00 crc kubenswrapper[4860]: I0320 11:34:00.149895 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9a43112-1781-421d-9123-971f77f6739e" containerName="registry-server" Mar 20 11:34:00 crc kubenswrapper[4860]: E0320 11:34:00.149907 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9a43112-1781-421d-9123-971f77f6739e" containerName="extract-utilities" Mar 20 11:34:00 crc kubenswrapper[4860]: I0320 11:34:00.149914 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9a43112-1781-421d-9123-971f77f6739e" containerName="extract-utilities" Mar 20 11:34:00 crc kubenswrapper[4860]: E0320 11:34:00.149936 4860 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca22abec-1b58-4b4d-a3a8-0744e4684074" containerName="extract-utilities" Mar 20 11:34:00 crc kubenswrapper[4860]: I0320 11:34:00.149943 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca22abec-1b58-4b4d-a3a8-0744e4684074" containerName="extract-utilities" Mar 20 11:34:00 crc kubenswrapper[4860]: E0320 11:34:00.149958 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca22abec-1b58-4b4d-a3a8-0744e4684074" containerName="registry-server" Mar 20 11:34:00 crc kubenswrapper[4860]: I0320 11:34:00.149965 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca22abec-1b58-4b4d-a3a8-0744e4684074" containerName="registry-server" Mar 20 11:34:00 crc kubenswrapper[4860]: I0320 11:34:00.150119 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca22abec-1b58-4b4d-a3a8-0744e4684074" containerName="registry-server" Mar 20 11:34:00 crc kubenswrapper[4860]: I0320 11:34:00.150140 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9a43112-1781-421d-9123-971f77f6739e" containerName="registry-server" Mar 20 11:34:00 crc kubenswrapper[4860]: I0320 11:34:00.150836 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566774-jmb8l" Mar 20 11:34:00 crc kubenswrapper[4860]: I0320 11:34:00.157765 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566774-jmb8l"] Mar 20 11:34:00 crc kubenswrapper[4860]: I0320 11:34:00.160284 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-82p2f" Mar 20 11:34:00 crc kubenswrapper[4860]: I0320 11:34:00.160573 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:34:00 crc kubenswrapper[4860]: I0320 11:34:00.160793 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:34:00 crc kubenswrapper[4860]: I0320 11:34:00.306170 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dd248\" (UniqueName: \"kubernetes.io/projected/34154cfb-cf91-40a0-8390-78823b11b698-kube-api-access-dd248\") pod \"auto-csr-approver-29566774-jmb8l\" (UID: \"34154cfb-cf91-40a0-8390-78823b11b698\") " pod="openshift-infra/auto-csr-approver-29566774-jmb8l" Mar 20 11:34:00 crc kubenswrapper[4860]: I0320 11:34:00.408411 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dd248\" (UniqueName: \"kubernetes.io/projected/34154cfb-cf91-40a0-8390-78823b11b698-kube-api-access-dd248\") pod \"auto-csr-approver-29566774-jmb8l\" (UID: \"34154cfb-cf91-40a0-8390-78823b11b698\") " pod="openshift-infra/auto-csr-approver-29566774-jmb8l" Mar 20 11:34:00 crc kubenswrapper[4860]: I0320 11:34:00.431711 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dd248\" (UniqueName: \"kubernetes.io/projected/34154cfb-cf91-40a0-8390-78823b11b698-kube-api-access-dd248\") pod \"auto-csr-approver-29566774-jmb8l\" (UID: \"34154cfb-cf91-40a0-8390-78823b11b698\") " 
pod="openshift-infra/auto-csr-approver-29566774-jmb8l" Mar 20 11:34:00 crc kubenswrapper[4860]: I0320 11:34:00.483138 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566774-jmb8l" Mar 20 11:34:00 crc kubenswrapper[4860]: I0320 11:34:00.926274 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566774-jmb8l"] Mar 20 11:34:01 crc kubenswrapper[4860]: I0320 11:34:01.414402 4860 scope.go:117] "RemoveContainer" containerID="174cf0258fb93c52c964e89ccb7ea77d9cb0e4fb77f37a49b81be365390d1f67" Mar 20 11:34:01 crc kubenswrapper[4860]: E0320 11:34:01.414694 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:34:01 crc kubenswrapper[4860]: I0320 11:34:01.876861 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566774-jmb8l" event={"ID":"34154cfb-cf91-40a0-8390-78823b11b698","Type":"ContainerStarted","Data":"aa5e784f19c7560daee1c9bbf661ef9378c178b55f0eab3f7d66816056d92381"} Mar 20 11:34:02 crc kubenswrapper[4860]: I0320 11:34:02.886462 4860 generic.go:334] "Generic (PLEG): container finished" podID="34154cfb-cf91-40a0-8390-78823b11b698" containerID="bd1a330fd37e3c9d04a847aabc8c53a75648f5f571a68d378c964de8d51bbab7" exitCode=0 Mar 20 11:34:02 crc kubenswrapper[4860]: I0320 11:34:02.886565 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566774-jmb8l" event={"ID":"34154cfb-cf91-40a0-8390-78823b11b698","Type":"ContainerDied","Data":"bd1a330fd37e3c9d04a847aabc8c53a75648f5f571a68d378c964de8d51bbab7"} 
Mar 20 11:34:04 crc kubenswrapper[4860]: I0320 11:34:04.174440 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566774-jmb8l" Mar 20 11:34:04 crc kubenswrapper[4860]: I0320 11:34:04.273192 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dd248\" (UniqueName: \"kubernetes.io/projected/34154cfb-cf91-40a0-8390-78823b11b698-kube-api-access-dd248\") pod \"34154cfb-cf91-40a0-8390-78823b11b698\" (UID: \"34154cfb-cf91-40a0-8390-78823b11b698\") " Mar 20 11:34:04 crc kubenswrapper[4860]: I0320 11:34:04.280702 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34154cfb-cf91-40a0-8390-78823b11b698-kube-api-access-dd248" (OuterVolumeSpecName: "kube-api-access-dd248") pod "34154cfb-cf91-40a0-8390-78823b11b698" (UID: "34154cfb-cf91-40a0-8390-78823b11b698"). InnerVolumeSpecName "kube-api-access-dd248". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:34:04 crc kubenswrapper[4860]: I0320 11:34:04.377200 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dd248\" (UniqueName: \"kubernetes.io/projected/34154cfb-cf91-40a0-8390-78823b11b698-kube-api-access-dd248\") on node \"crc\" DevicePath \"\"" Mar 20 11:34:04 crc kubenswrapper[4860]: I0320 11:34:04.902869 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566774-jmb8l" event={"ID":"34154cfb-cf91-40a0-8390-78823b11b698","Type":"ContainerDied","Data":"aa5e784f19c7560daee1c9bbf661ef9378c178b55f0eab3f7d66816056d92381"} Mar 20 11:34:04 crc kubenswrapper[4860]: I0320 11:34:04.903413 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa5e784f19c7560daee1c9bbf661ef9378c178b55f0eab3f7d66816056d92381" Mar 20 11:34:04 crc kubenswrapper[4860]: I0320 11:34:04.902934 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566774-jmb8l" Mar 20 11:34:05 crc kubenswrapper[4860]: I0320 11:34:05.265574 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566768-r8jtv"] Mar 20 11:34:05 crc kubenswrapper[4860]: I0320 11:34:05.265679 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566768-r8jtv"] Mar 20 11:34:05 crc kubenswrapper[4860]: I0320 11:34:05.425043 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7b6bebe-e27d-4ca9-9b8b-3f6ba51b8cd2" path="/var/lib/kubelet/pods/e7b6bebe-e27d-4ca9-9b8b-3f6ba51b8cd2/volumes" Mar 20 11:34:13 crc kubenswrapper[4860]: I0320 11:34:13.414061 4860 scope.go:117] "RemoveContainer" containerID="174cf0258fb93c52c964e89ccb7ea77d9cb0e4fb77f37a49b81be365390d1f67" Mar 20 11:34:13 crc kubenswrapper[4860]: E0320 11:34:13.414997 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:34:26 crc kubenswrapper[4860]: I0320 11:34:26.414334 4860 scope.go:117] "RemoveContainer" containerID="174cf0258fb93c52c964e89ccb7ea77d9cb0e4fb77f37a49b81be365390d1f67" Mar 20 11:34:26 crc kubenswrapper[4860]: E0320 11:34:26.415388 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" 
podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:34:39 crc kubenswrapper[4860]: I0320 11:34:39.413736 4860 scope.go:117] "RemoveContainer" containerID="174cf0258fb93c52c964e89ccb7ea77d9cb0e4fb77f37a49b81be365390d1f67" Mar 20 11:34:39 crc kubenswrapper[4860]: E0320 11:34:39.414878 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:34:48 crc kubenswrapper[4860]: I0320 11:34:48.803950 4860 scope.go:117] "RemoveContainer" containerID="43beffcf03fad4254e2bb3cab94aa4a32cf894399c7d22a72063beb87aa2fc0f" Mar 20 11:34:54 crc kubenswrapper[4860]: I0320 11:34:54.415181 4860 scope.go:117] "RemoveContainer" containerID="174cf0258fb93c52c964e89ccb7ea77d9cb0e4fb77f37a49b81be365390d1f67" Mar 20 11:34:54 crc kubenswrapper[4860]: E0320 11:34:54.416027 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:35:08 crc kubenswrapper[4860]: I0320 11:35:08.413944 4860 scope.go:117] "RemoveContainer" containerID="174cf0258fb93c52c964e89ccb7ea77d9cb0e4fb77f37a49b81be365390d1f67" Mar 20 11:35:08 crc kubenswrapper[4860]: E0320 11:35:08.415754 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:35:22 crc kubenswrapper[4860]: I0320 11:35:22.414209 4860 scope.go:117] "RemoveContainer" containerID="174cf0258fb93c52c964e89ccb7ea77d9cb0e4fb77f37a49b81be365390d1f67" Mar 20 11:35:22 crc kubenswrapper[4860]: E0320 11:35:22.416163 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:35:34 crc kubenswrapper[4860]: I0320 11:35:34.413297 4860 scope.go:117] "RemoveContainer" containerID="174cf0258fb93c52c964e89ccb7ea77d9cb0e4fb77f37a49b81be365390d1f67" Mar 20 11:35:34 crc kubenswrapper[4860]: E0320 11:35:34.414456 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:35:45 crc kubenswrapper[4860]: I0320 11:35:45.413989 4860 scope.go:117] "RemoveContainer" containerID="174cf0258fb93c52c964e89ccb7ea77d9cb0e4fb77f37a49b81be365390d1f67" Mar 20 11:35:45 crc kubenswrapper[4860]: E0320 11:35:45.415161 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:35:57 crc kubenswrapper[4860]: I0320 11:35:57.418143 4860 scope.go:117] "RemoveContainer" containerID="174cf0258fb93c52c964e89ccb7ea77d9cb0e4fb77f37a49b81be365390d1f67" Mar 20 11:35:57 crc kubenswrapper[4860]: E0320 11:35:57.419408 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:36:00 crc kubenswrapper[4860]: I0320 11:36:00.145910 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566776-6j7wf"] Mar 20 11:36:00 crc kubenswrapper[4860]: E0320 11:36:00.146411 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34154cfb-cf91-40a0-8390-78823b11b698" containerName="oc" Mar 20 11:36:00 crc kubenswrapper[4860]: I0320 11:36:00.146430 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="34154cfb-cf91-40a0-8390-78823b11b698" containerName="oc" Mar 20 11:36:00 crc kubenswrapper[4860]: I0320 11:36:00.146642 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="34154cfb-cf91-40a0-8390-78823b11b698" containerName="oc" Mar 20 11:36:00 crc kubenswrapper[4860]: I0320 11:36:00.147416 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566776-6j7wf" Mar 20 11:36:00 crc kubenswrapper[4860]: I0320 11:36:00.150319 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:36:00 crc kubenswrapper[4860]: I0320 11:36:00.150623 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:36:00 crc kubenswrapper[4860]: I0320 11:36:00.150952 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-82p2f" Mar 20 11:36:00 crc kubenswrapper[4860]: I0320 11:36:00.157522 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566776-6j7wf"] Mar 20 11:36:00 crc kubenswrapper[4860]: I0320 11:36:00.217036 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2l9r2\" (UniqueName: \"kubernetes.io/projected/fd9943d6-a840-4ba5-b12c-9ebf3cbd1224-kube-api-access-2l9r2\") pod \"auto-csr-approver-29566776-6j7wf\" (UID: \"fd9943d6-a840-4ba5-b12c-9ebf3cbd1224\") " pod="openshift-infra/auto-csr-approver-29566776-6j7wf" Mar 20 11:36:00 crc kubenswrapper[4860]: I0320 11:36:00.319028 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2l9r2\" (UniqueName: \"kubernetes.io/projected/fd9943d6-a840-4ba5-b12c-9ebf3cbd1224-kube-api-access-2l9r2\") pod \"auto-csr-approver-29566776-6j7wf\" (UID: \"fd9943d6-a840-4ba5-b12c-9ebf3cbd1224\") " pod="openshift-infra/auto-csr-approver-29566776-6j7wf" Mar 20 11:36:00 crc kubenswrapper[4860]: I0320 11:36:00.352093 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2l9r2\" (UniqueName: \"kubernetes.io/projected/fd9943d6-a840-4ba5-b12c-9ebf3cbd1224-kube-api-access-2l9r2\") pod \"auto-csr-approver-29566776-6j7wf\" (UID: \"fd9943d6-a840-4ba5-b12c-9ebf3cbd1224\") " 
pod="openshift-infra/auto-csr-approver-29566776-6j7wf" Mar 20 11:36:00 crc kubenswrapper[4860]: I0320 11:36:00.477722 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566776-6j7wf" Mar 20 11:36:00 crc kubenswrapper[4860]: I0320 11:36:00.915729 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566776-6j7wf"] Mar 20 11:36:00 crc kubenswrapper[4860]: W0320 11:36:00.920043 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd9943d6_a840_4ba5_b12c_9ebf3cbd1224.slice/crio-4e00d96b74b5759c3798b31ab9f818aa77b10c25b8d17db54a8dd93bdd8425a4 WatchSource:0}: Error finding container 4e00d96b74b5759c3798b31ab9f818aa77b10c25b8d17db54a8dd93bdd8425a4: Status 404 returned error can't find the container with id 4e00d96b74b5759c3798b31ab9f818aa77b10c25b8d17db54a8dd93bdd8425a4 Mar 20 11:36:00 crc kubenswrapper[4860]: I0320 11:36:00.941702 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566776-6j7wf" event={"ID":"fd9943d6-a840-4ba5-b12c-9ebf3cbd1224","Type":"ContainerStarted","Data":"4e00d96b74b5759c3798b31ab9f818aa77b10c25b8d17db54a8dd93bdd8425a4"} Mar 20 11:36:02 crc kubenswrapper[4860]: I0320 11:36:02.975006 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566776-6j7wf" event={"ID":"fd9943d6-a840-4ba5-b12c-9ebf3cbd1224","Type":"ContainerStarted","Data":"d289c9e2018c59ba8e3c1ee7b7a591cd4418a9530b3b771d0a13873b671d4bdf"} Mar 20 11:36:03 crc kubenswrapper[4860]: I0320 11:36:03.983810 4860 generic.go:334] "Generic (PLEG): container finished" podID="fd9943d6-a840-4ba5-b12c-9ebf3cbd1224" containerID="d289c9e2018c59ba8e3c1ee7b7a591cd4418a9530b3b771d0a13873b671d4bdf" exitCode=0 Mar 20 11:36:03 crc kubenswrapper[4860]: I0320 11:36:03.983873 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29566776-6j7wf" event={"ID":"fd9943d6-a840-4ba5-b12c-9ebf3cbd1224","Type":"ContainerDied","Data":"d289c9e2018c59ba8e3c1ee7b7a591cd4418a9530b3b771d0a13873b671d4bdf"} Mar 20 11:36:04 crc kubenswrapper[4860]: I0320 11:36:04.315088 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566776-6j7wf" Mar 20 11:36:04 crc kubenswrapper[4860]: I0320 11:36:04.490948 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2l9r2\" (UniqueName: \"kubernetes.io/projected/fd9943d6-a840-4ba5-b12c-9ebf3cbd1224-kube-api-access-2l9r2\") pod \"fd9943d6-a840-4ba5-b12c-9ebf3cbd1224\" (UID: \"fd9943d6-a840-4ba5-b12c-9ebf3cbd1224\") " Mar 20 11:36:04 crc kubenswrapper[4860]: I0320 11:36:04.497870 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd9943d6-a840-4ba5-b12c-9ebf3cbd1224-kube-api-access-2l9r2" (OuterVolumeSpecName: "kube-api-access-2l9r2") pod "fd9943d6-a840-4ba5-b12c-9ebf3cbd1224" (UID: "fd9943d6-a840-4ba5-b12c-9ebf3cbd1224"). InnerVolumeSpecName "kube-api-access-2l9r2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:36:04 crc kubenswrapper[4860]: I0320 11:36:04.592523 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2l9r2\" (UniqueName: \"kubernetes.io/projected/fd9943d6-a840-4ba5-b12c-9ebf3cbd1224-kube-api-access-2l9r2\") on node \"crc\" DevicePath \"\"" Mar 20 11:36:04 crc kubenswrapper[4860]: I0320 11:36:04.993291 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566776-6j7wf" event={"ID":"fd9943d6-a840-4ba5-b12c-9ebf3cbd1224","Type":"ContainerDied","Data":"4e00d96b74b5759c3798b31ab9f818aa77b10c25b8d17db54a8dd93bdd8425a4"} Mar 20 11:36:04 crc kubenswrapper[4860]: I0320 11:36:04.993845 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e00d96b74b5759c3798b31ab9f818aa77b10c25b8d17db54a8dd93bdd8425a4" Mar 20 11:36:04 crc kubenswrapper[4860]: I0320 11:36:04.993344 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566776-6j7wf" Mar 20 11:36:05 crc kubenswrapper[4860]: I0320 11:36:05.397548 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566770-bmm72"] Mar 20 11:36:05 crc kubenswrapper[4860]: I0320 11:36:05.403805 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566770-bmm72"] Mar 20 11:36:05 crc kubenswrapper[4860]: I0320 11:36:05.424624 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d911fad2-83cb-46e3-8f48-eb6f4b0e5605" path="/var/lib/kubelet/pods/d911fad2-83cb-46e3-8f48-eb6f4b0e5605/volumes" Mar 20 11:36:09 crc kubenswrapper[4860]: I0320 11:36:09.414360 4860 scope.go:117] "RemoveContainer" containerID="174cf0258fb93c52c964e89ccb7ea77d9cb0e4fb77f37a49b81be365390d1f67" Mar 20 11:36:09 crc kubenswrapper[4860]: E0320 11:36:09.415181 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:36:24 crc kubenswrapper[4860]: I0320 11:36:24.415689 4860 scope.go:117] "RemoveContainer" containerID="174cf0258fb93c52c964e89ccb7ea77d9cb0e4fb77f37a49b81be365390d1f67" Mar 20 11:36:24 crc kubenswrapper[4860]: E0320 11:36:24.417383 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:36:38 crc kubenswrapper[4860]: I0320 11:36:38.413125 4860 scope.go:117] "RemoveContainer" containerID="174cf0258fb93c52c964e89ccb7ea77d9cb0e4fb77f37a49b81be365390d1f67" Mar 20 11:36:38 crc kubenswrapper[4860]: E0320 11:36:38.414096 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:36:48 crc kubenswrapper[4860]: I0320 11:36:48.898309 4860 scope.go:117] "RemoveContainer" containerID="6c25fc89bb9294757bf7b8ce97118d32231c76b11a8724a70385e83cc510600a" Mar 20 11:36:49 crc kubenswrapper[4860]: I0320 11:36:49.413870 4860 scope.go:117] "RemoveContainer" 
containerID="174cf0258fb93c52c964e89ccb7ea77d9cb0e4fb77f37a49b81be365390d1f67" Mar 20 11:36:49 crc kubenswrapper[4860]: E0320 11:36:49.414488 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:37:02 crc kubenswrapper[4860]: I0320 11:37:02.414037 4860 scope.go:117] "RemoveContainer" containerID="174cf0258fb93c52c964e89ccb7ea77d9cb0e4fb77f37a49b81be365390d1f67" Mar 20 11:37:02 crc kubenswrapper[4860]: E0320 11:37:02.414907 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:37:17 crc kubenswrapper[4860]: I0320 11:37:17.419085 4860 scope.go:117] "RemoveContainer" containerID="174cf0258fb93c52c964e89ccb7ea77d9cb0e4fb77f37a49b81be365390d1f67" Mar 20 11:37:17 crc kubenswrapper[4860]: E0320 11:37:17.421972 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:37:28 crc kubenswrapper[4860]: I0320 11:37:28.414135 4860 scope.go:117] 
"RemoveContainer" containerID="174cf0258fb93c52c964e89ccb7ea77d9cb0e4fb77f37a49b81be365390d1f67" Mar 20 11:37:28 crc kubenswrapper[4860]: E0320 11:37:28.416856 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:37:39 crc kubenswrapper[4860]: I0320 11:37:39.414026 4860 scope.go:117] "RemoveContainer" containerID="174cf0258fb93c52c964e89ccb7ea77d9cb0e4fb77f37a49b81be365390d1f67" Mar 20 11:37:39 crc kubenswrapper[4860]: E0320 11:37:39.415243 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:37:52 crc kubenswrapper[4860]: I0320 11:37:52.413689 4860 scope.go:117] "RemoveContainer" containerID="174cf0258fb93c52c964e89ccb7ea77d9cb0e4fb77f37a49b81be365390d1f67" Mar 20 11:37:52 crc kubenswrapper[4860]: I0320 11:37:52.895623 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" event={"ID":"6a9df230-75a1-4b64-8d00-c179e9c19080","Type":"ContainerStarted","Data":"855d8453e4234a7aa607d0c491023b12563d04004d3513d67a5fbedfc07f707d"} Mar 20 11:38:00 crc kubenswrapper[4860]: I0320 11:38:00.152624 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566778-dqv4l"] Mar 20 11:38:00 crc kubenswrapper[4860]: E0320 
11:38:00.157416 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd9943d6-a840-4ba5-b12c-9ebf3cbd1224" containerName="oc" Mar 20 11:38:00 crc kubenswrapper[4860]: I0320 11:38:00.157743 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd9943d6-a840-4ba5-b12c-9ebf3cbd1224" containerName="oc" Mar 20 11:38:00 crc kubenswrapper[4860]: I0320 11:38:00.158122 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd9943d6-a840-4ba5-b12c-9ebf3cbd1224" containerName="oc" Mar 20 11:38:00 crc kubenswrapper[4860]: I0320 11:38:00.158993 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566778-dqv4l" Mar 20 11:38:00 crc kubenswrapper[4860]: I0320 11:38:00.162188 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:38:00 crc kubenswrapper[4860]: I0320 11:38:00.162290 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-82p2f" Mar 20 11:38:00 crc kubenswrapper[4860]: I0320 11:38:00.163747 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:38:00 crc kubenswrapper[4860]: I0320 11:38:00.164671 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566778-dqv4l"] Mar 20 11:38:00 crc kubenswrapper[4860]: I0320 11:38:00.173141 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ck5pk\" (UniqueName: \"kubernetes.io/projected/04737161-e8a4-4231-8b96-1a617b9561a7-kube-api-access-ck5pk\") pod \"auto-csr-approver-29566778-dqv4l\" (UID: \"04737161-e8a4-4231-8b96-1a617b9561a7\") " pod="openshift-infra/auto-csr-approver-29566778-dqv4l" Mar 20 11:38:00 crc kubenswrapper[4860]: I0320 11:38:00.274044 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-ck5pk\" (UniqueName: \"kubernetes.io/projected/04737161-e8a4-4231-8b96-1a617b9561a7-kube-api-access-ck5pk\") pod \"auto-csr-approver-29566778-dqv4l\" (UID: \"04737161-e8a4-4231-8b96-1a617b9561a7\") " pod="openshift-infra/auto-csr-approver-29566778-dqv4l" Mar 20 11:38:00 crc kubenswrapper[4860]: I0320 11:38:00.296929 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ck5pk\" (UniqueName: \"kubernetes.io/projected/04737161-e8a4-4231-8b96-1a617b9561a7-kube-api-access-ck5pk\") pod \"auto-csr-approver-29566778-dqv4l\" (UID: \"04737161-e8a4-4231-8b96-1a617b9561a7\") " pod="openshift-infra/auto-csr-approver-29566778-dqv4l" Mar 20 11:38:00 crc kubenswrapper[4860]: I0320 11:38:00.482602 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566778-dqv4l" Mar 20 11:38:00 crc kubenswrapper[4860]: I0320 11:38:00.953908 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566778-dqv4l"] Mar 20 11:38:01 crc kubenswrapper[4860]: I0320 11:38:01.974669 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566778-dqv4l" event={"ID":"04737161-e8a4-4231-8b96-1a617b9561a7","Type":"ContainerStarted","Data":"829048024aa9afa867e48a01346be2971ff40c547363a521573193a78b169151"} Mar 20 11:38:02 crc kubenswrapper[4860]: I0320 11:38:02.985506 4860 generic.go:334] "Generic (PLEG): container finished" podID="04737161-e8a4-4231-8b96-1a617b9561a7" containerID="2bf13e1cbb626df84de24a90ddc00424f7dbac653c634127ff56b49722ddadfd" exitCode=0 Mar 20 11:38:02 crc kubenswrapper[4860]: I0320 11:38:02.985590 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566778-dqv4l" event={"ID":"04737161-e8a4-4231-8b96-1a617b9561a7","Type":"ContainerDied","Data":"2bf13e1cbb626df84de24a90ddc00424f7dbac653c634127ff56b49722ddadfd"} Mar 20 11:38:04 crc kubenswrapper[4860]: I0320 
11:38:04.309731 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566778-dqv4l" Mar 20 11:38:04 crc kubenswrapper[4860]: I0320 11:38:04.443849 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ck5pk\" (UniqueName: \"kubernetes.io/projected/04737161-e8a4-4231-8b96-1a617b9561a7-kube-api-access-ck5pk\") pod \"04737161-e8a4-4231-8b96-1a617b9561a7\" (UID: \"04737161-e8a4-4231-8b96-1a617b9561a7\") " Mar 20 11:38:04 crc kubenswrapper[4860]: I0320 11:38:04.451002 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04737161-e8a4-4231-8b96-1a617b9561a7-kube-api-access-ck5pk" (OuterVolumeSpecName: "kube-api-access-ck5pk") pod "04737161-e8a4-4231-8b96-1a617b9561a7" (UID: "04737161-e8a4-4231-8b96-1a617b9561a7"). InnerVolumeSpecName "kube-api-access-ck5pk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:38:04 crc kubenswrapper[4860]: I0320 11:38:04.546478 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ck5pk\" (UniqueName: \"kubernetes.io/projected/04737161-e8a4-4231-8b96-1a617b9561a7-kube-api-access-ck5pk\") on node \"crc\" DevicePath \"\"" Mar 20 11:38:05 crc kubenswrapper[4860]: I0320 11:38:05.023630 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566778-dqv4l" event={"ID":"04737161-e8a4-4231-8b96-1a617b9561a7","Type":"ContainerDied","Data":"829048024aa9afa867e48a01346be2971ff40c547363a521573193a78b169151"} Mar 20 11:38:05 crc kubenswrapper[4860]: I0320 11:38:05.023690 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566778-dqv4l" Mar 20 11:38:05 crc kubenswrapper[4860]: I0320 11:38:05.023701 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="829048024aa9afa867e48a01346be2971ff40c547363a521573193a78b169151" Mar 20 11:38:05 crc kubenswrapper[4860]: I0320 11:38:05.388552 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566772-bq77p"] Mar 20 11:38:05 crc kubenswrapper[4860]: I0320 11:38:05.395263 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566772-bq77p"] Mar 20 11:38:05 crc kubenswrapper[4860]: I0320 11:38:05.422850 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e22a4be9-9edf-4029-b504-f5c059318959" path="/var/lib/kubelet/pods/e22a4be9-9edf-4029-b504-f5c059318959/volumes" Mar 20 11:38:49 crc kubenswrapper[4860]: I0320 11:38:49.021407 4860 scope.go:117] "RemoveContainer" containerID="954a495195a9f9d931051a4c1f1eba69bbfb896f6fe4601ca7ac4a6c57e030ea" Mar 20 11:39:52 crc kubenswrapper[4860]: I0320 11:39:52.344737 4860 patch_prober.go:28] interesting pod/machine-config-daemon-kvdqp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:39:52 crc kubenswrapper[4860]: I0320 11:39:52.345837 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:40:00 crc kubenswrapper[4860]: I0320 11:40:00.154512 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566780-dpx82"] Mar 20 
11:40:00 crc kubenswrapper[4860]: E0320 11:40:00.156498 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04737161-e8a4-4231-8b96-1a617b9561a7" containerName="oc" Mar 20 11:40:00 crc kubenswrapper[4860]: I0320 11:40:00.156521 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="04737161-e8a4-4231-8b96-1a617b9561a7" containerName="oc" Mar 20 11:40:00 crc kubenswrapper[4860]: I0320 11:40:00.156747 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="04737161-e8a4-4231-8b96-1a617b9561a7" containerName="oc" Mar 20 11:40:00 crc kubenswrapper[4860]: I0320 11:40:00.157566 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566780-dpx82" Mar 20 11:40:00 crc kubenswrapper[4860]: I0320 11:40:00.159860 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-82p2f" Mar 20 11:40:00 crc kubenswrapper[4860]: I0320 11:40:00.160788 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:40:00 crc kubenswrapper[4860]: I0320 11:40:00.160976 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:40:00 crc kubenswrapper[4860]: I0320 11:40:00.178694 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566780-dpx82"] Mar 20 11:40:00 crc kubenswrapper[4860]: I0320 11:40:00.263055 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8dwx\" (UniqueName: \"kubernetes.io/projected/0f75e65d-773b-4474-985a-2ca6fea0dc6a-kube-api-access-c8dwx\") pod \"auto-csr-approver-29566780-dpx82\" (UID: \"0f75e65d-773b-4474-985a-2ca6fea0dc6a\") " pod="openshift-infra/auto-csr-approver-29566780-dpx82" Mar 20 11:40:00 crc kubenswrapper[4860]: I0320 11:40:00.364918 4860 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-c8dwx\" (UniqueName: \"kubernetes.io/projected/0f75e65d-773b-4474-985a-2ca6fea0dc6a-kube-api-access-c8dwx\") pod \"auto-csr-approver-29566780-dpx82\" (UID: \"0f75e65d-773b-4474-985a-2ca6fea0dc6a\") " pod="openshift-infra/auto-csr-approver-29566780-dpx82" Mar 20 11:40:00 crc kubenswrapper[4860]: I0320 11:40:00.388819 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8dwx\" (UniqueName: \"kubernetes.io/projected/0f75e65d-773b-4474-985a-2ca6fea0dc6a-kube-api-access-c8dwx\") pod \"auto-csr-approver-29566780-dpx82\" (UID: \"0f75e65d-773b-4474-985a-2ca6fea0dc6a\") " pod="openshift-infra/auto-csr-approver-29566780-dpx82" Mar 20 11:40:00 crc kubenswrapper[4860]: I0320 11:40:00.485010 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566780-dpx82" Mar 20 11:40:00 crc kubenswrapper[4860]: I0320 11:40:00.942121 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566780-dpx82"] Mar 20 11:40:00 crc kubenswrapper[4860]: I0320 11:40:00.954031 4860 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 11:40:01 crc kubenswrapper[4860]: I0320 11:40:01.957522 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566780-dpx82" event={"ID":"0f75e65d-773b-4474-985a-2ca6fea0dc6a","Type":"ContainerStarted","Data":"2d388af5cf98d85e1bc39a3ca3083e9b0d64a28e6ef17cf1f4fb688fd499b7e3"} Mar 20 11:40:02 crc kubenswrapper[4860]: I0320 11:40:02.969147 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566780-dpx82" event={"ID":"0f75e65d-773b-4474-985a-2ca6fea0dc6a","Type":"ContainerStarted","Data":"7535e8513557739754c0e4889cb6b1e5e63acd730d71679b5f7aab2ccfb3bbbd"} Mar 20 11:40:03 crc kubenswrapper[4860]: I0320 11:40:03.978427 4860 generic.go:334] 
"Generic (PLEG): container finished" podID="0f75e65d-773b-4474-985a-2ca6fea0dc6a" containerID="7535e8513557739754c0e4889cb6b1e5e63acd730d71679b5f7aab2ccfb3bbbd" exitCode=0 Mar 20 11:40:03 crc kubenswrapper[4860]: I0320 11:40:03.978493 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566780-dpx82" event={"ID":"0f75e65d-773b-4474-985a-2ca6fea0dc6a","Type":"ContainerDied","Data":"7535e8513557739754c0e4889cb6b1e5e63acd730d71679b5f7aab2ccfb3bbbd"} Mar 20 11:40:04 crc kubenswrapper[4860]: I0320 11:40:04.274937 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566780-dpx82" Mar 20 11:40:04 crc kubenswrapper[4860]: I0320 11:40:04.446979 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8dwx\" (UniqueName: \"kubernetes.io/projected/0f75e65d-773b-4474-985a-2ca6fea0dc6a-kube-api-access-c8dwx\") pod \"0f75e65d-773b-4474-985a-2ca6fea0dc6a\" (UID: \"0f75e65d-773b-4474-985a-2ca6fea0dc6a\") " Mar 20 11:40:04 crc kubenswrapper[4860]: I0320 11:40:04.455051 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f75e65d-773b-4474-985a-2ca6fea0dc6a-kube-api-access-c8dwx" (OuterVolumeSpecName: "kube-api-access-c8dwx") pod "0f75e65d-773b-4474-985a-2ca6fea0dc6a" (UID: "0f75e65d-773b-4474-985a-2ca6fea0dc6a"). InnerVolumeSpecName "kube-api-access-c8dwx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:40:04 crc kubenswrapper[4860]: I0320 11:40:04.549078 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c8dwx\" (UniqueName: \"kubernetes.io/projected/0f75e65d-773b-4474-985a-2ca6fea0dc6a-kube-api-access-c8dwx\") on node \"crc\" DevicePath \"\"" Mar 20 11:40:04 crc kubenswrapper[4860]: I0320 11:40:04.988608 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566780-dpx82" event={"ID":"0f75e65d-773b-4474-985a-2ca6fea0dc6a","Type":"ContainerDied","Data":"2d388af5cf98d85e1bc39a3ca3083e9b0d64a28e6ef17cf1f4fb688fd499b7e3"} Mar 20 11:40:04 crc kubenswrapper[4860]: I0320 11:40:04.988648 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566780-dpx82" Mar 20 11:40:04 crc kubenswrapper[4860]: I0320 11:40:04.988670 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d388af5cf98d85e1bc39a3ca3083e9b0d64a28e6ef17cf1f4fb688fd499b7e3" Mar 20 11:40:05 crc kubenswrapper[4860]: I0320 11:40:05.359930 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566774-jmb8l"] Mar 20 11:40:05 crc kubenswrapper[4860]: I0320 11:40:05.366657 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566774-jmb8l"] Mar 20 11:40:05 crc kubenswrapper[4860]: I0320 11:40:05.422436 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34154cfb-cf91-40a0-8390-78823b11b698" path="/var/lib/kubelet/pods/34154cfb-cf91-40a0-8390-78823b11b698/volumes" Mar 20 11:40:22 crc kubenswrapper[4860]: I0320 11:40:22.343795 4860 patch_prober.go:28] interesting pod/machine-config-daemon-kvdqp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 20 11:40:22 crc kubenswrapper[4860]: I0320 11:40:22.344708 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:40:49 crc kubenswrapper[4860]: I0320 11:40:49.111338 4860 scope.go:117] "RemoveContainer" containerID="bd1a330fd37e3c9d04a847aabc8c53a75648f5f571a68d378c964de8d51bbab7" Mar 20 11:40:52 crc kubenswrapper[4860]: I0320 11:40:52.343895 4860 patch_prober.go:28] interesting pod/machine-config-daemon-kvdqp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:40:52 crc kubenswrapper[4860]: I0320 11:40:52.344507 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:40:52 crc kubenswrapper[4860]: I0320 11:40:52.344570 4860 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" Mar 20 11:40:52 crc kubenswrapper[4860]: I0320 11:40:52.345482 4860 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"855d8453e4234a7aa607d0c491023b12563d04004d3513d67a5fbedfc07f707d"} pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 
20 11:40:52 crc kubenswrapper[4860]: I0320 11:40:52.345542 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" containerID="cri-o://855d8453e4234a7aa607d0c491023b12563d04004d3513d67a5fbedfc07f707d" gracePeriod=600 Mar 20 11:40:53 crc kubenswrapper[4860]: I0320 11:40:53.393057 4860 generic.go:334] "Generic (PLEG): container finished" podID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerID="855d8453e4234a7aa607d0c491023b12563d04004d3513d67a5fbedfc07f707d" exitCode=0 Mar 20 11:40:53 crc kubenswrapper[4860]: I0320 11:40:53.393133 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" event={"ID":"6a9df230-75a1-4b64-8d00-c179e9c19080","Type":"ContainerDied","Data":"855d8453e4234a7aa607d0c491023b12563d04004d3513d67a5fbedfc07f707d"} Mar 20 11:40:53 crc kubenswrapper[4860]: I0320 11:40:53.393898 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" event={"ID":"6a9df230-75a1-4b64-8d00-c179e9c19080","Type":"ContainerStarted","Data":"2c6cd87f8a77136a1839b5f761bdf9d0b06d37fb07e23d5035e6ffed2dcf296d"} Mar 20 11:40:53 crc kubenswrapper[4860]: I0320 11:40:53.393945 4860 scope.go:117] "RemoveContainer" containerID="174cf0258fb93c52c964e89ccb7ea77d9cb0e4fb77f37a49b81be365390d1f67" Mar 20 11:41:34 crc kubenswrapper[4860]: I0320 11:41:34.621694 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7pbqv"] Mar 20 11:41:34 crc kubenswrapper[4860]: E0320 11:41:34.622974 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f75e65d-773b-4474-985a-2ca6fea0dc6a" containerName="oc" Mar 20 11:41:34 crc kubenswrapper[4860]: I0320 11:41:34.622990 4860 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0f75e65d-773b-4474-985a-2ca6fea0dc6a" containerName="oc" Mar 20 11:41:34 crc kubenswrapper[4860]: I0320 11:41:34.623157 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f75e65d-773b-4474-985a-2ca6fea0dc6a" containerName="oc" Mar 20 11:41:34 crc kubenswrapper[4860]: I0320 11:41:34.624609 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7pbqv" Mar 20 11:41:34 crc kubenswrapper[4860]: I0320 11:41:34.636631 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7pbqv"] Mar 20 11:41:34 crc kubenswrapper[4860]: I0320 11:41:34.790862 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2af21af-50b4-4c92-9d39-40c326084305-utilities\") pod \"community-operators-7pbqv\" (UID: \"c2af21af-50b4-4c92-9d39-40c326084305\") " pod="openshift-marketplace/community-operators-7pbqv" Mar 20 11:41:34 crc kubenswrapper[4860]: I0320 11:41:34.791520 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2af21af-50b4-4c92-9d39-40c326084305-catalog-content\") pod \"community-operators-7pbqv\" (UID: \"c2af21af-50b4-4c92-9d39-40c326084305\") " pod="openshift-marketplace/community-operators-7pbqv" Mar 20 11:41:34 crc kubenswrapper[4860]: I0320 11:41:34.791582 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrf7f\" (UniqueName: \"kubernetes.io/projected/c2af21af-50b4-4c92-9d39-40c326084305-kube-api-access-hrf7f\") pod \"community-operators-7pbqv\" (UID: \"c2af21af-50b4-4c92-9d39-40c326084305\") " pod="openshift-marketplace/community-operators-7pbqv" Mar 20 11:41:34 crc kubenswrapper[4860]: I0320 11:41:34.893249 4860 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2af21af-50b4-4c92-9d39-40c326084305-utilities\") pod \"community-operators-7pbqv\" (UID: \"c2af21af-50b4-4c92-9d39-40c326084305\") " pod="openshift-marketplace/community-operators-7pbqv" Mar 20 11:41:34 crc kubenswrapper[4860]: I0320 11:41:34.893330 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2af21af-50b4-4c92-9d39-40c326084305-catalog-content\") pod \"community-operators-7pbqv\" (UID: \"c2af21af-50b4-4c92-9d39-40c326084305\") " pod="openshift-marketplace/community-operators-7pbqv" Mar 20 11:41:34 crc kubenswrapper[4860]: I0320 11:41:34.893405 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrf7f\" (UniqueName: \"kubernetes.io/projected/c2af21af-50b4-4c92-9d39-40c326084305-kube-api-access-hrf7f\") pod \"community-operators-7pbqv\" (UID: \"c2af21af-50b4-4c92-9d39-40c326084305\") " pod="openshift-marketplace/community-operators-7pbqv" Mar 20 11:41:34 crc kubenswrapper[4860]: I0320 11:41:34.893980 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2af21af-50b4-4c92-9d39-40c326084305-utilities\") pod \"community-operators-7pbqv\" (UID: \"c2af21af-50b4-4c92-9d39-40c326084305\") " pod="openshift-marketplace/community-operators-7pbqv" Mar 20 11:41:34 crc kubenswrapper[4860]: I0320 11:41:34.894184 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2af21af-50b4-4c92-9d39-40c326084305-catalog-content\") pod \"community-operators-7pbqv\" (UID: \"c2af21af-50b4-4c92-9d39-40c326084305\") " pod="openshift-marketplace/community-operators-7pbqv" Mar 20 11:41:34 crc kubenswrapper[4860]: I0320 11:41:34.921748 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrf7f\" 
(UniqueName: \"kubernetes.io/projected/c2af21af-50b4-4c92-9d39-40c326084305-kube-api-access-hrf7f\") pod \"community-operators-7pbqv\" (UID: \"c2af21af-50b4-4c92-9d39-40c326084305\") " pod="openshift-marketplace/community-operators-7pbqv" Mar 20 11:41:34 crc kubenswrapper[4860]: I0320 11:41:34.948252 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7pbqv" Mar 20 11:41:35 crc kubenswrapper[4860]: I0320 11:41:35.525383 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7pbqv"] Mar 20 11:41:35 crc kubenswrapper[4860]: I0320 11:41:35.718506 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7pbqv" event={"ID":"c2af21af-50b4-4c92-9d39-40c326084305","Type":"ContainerStarted","Data":"e5ac5ddd5c069f37405eb54fe0ce2a92994ce39706f8b4d4aa25212406b4a77b"} Mar 20 11:41:36 crc kubenswrapper[4860]: I0320 11:41:36.731091 4860 generic.go:334] "Generic (PLEG): container finished" podID="c2af21af-50b4-4c92-9d39-40c326084305" containerID="e56de3eee5e35cceaccf8c3589499f33bf561e49b0aa363dbe5ed01f4a6b7179" exitCode=0 Mar 20 11:41:36 crc kubenswrapper[4860]: I0320 11:41:36.731175 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7pbqv" event={"ID":"c2af21af-50b4-4c92-9d39-40c326084305","Type":"ContainerDied","Data":"e56de3eee5e35cceaccf8c3589499f33bf561e49b0aa363dbe5ed01f4a6b7179"} Mar 20 11:41:39 crc kubenswrapper[4860]: I0320 11:41:39.757512 4860 generic.go:334] "Generic (PLEG): container finished" podID="c2af21af-50b4-4c92-9d39-40c326084305" containerID="75b271fbe5c04e26afe7485152c4661d6f947a3f32fdbb61994bf363b9f198d8" exitCode=0 Mar 20 11:41:39 crc kubenswrapper[4860]: I0320 11:41:39.757589 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7pbqv" 
event={"ID":"c2af21af-50b4-4c92-9d39-40c326084305","Type":"ContainerDied","Data":"75b271fbe5c04e26afe7485152c4661d6f947a3f32fdbb61994bf363b9f198d8"} Mar 20 11:41:42 crc kubenswrapper[4860]: I0320 11:41:42.801699 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7pbqv" event={"ID":"c2af21af-50b4-4c92-9d39-40c326084305","Type":"ContainerStarted","Data":"1d10f0d6586195959bd87e5f3f283c5d0718b91b40a5ef2ac461e5f70cc27bb0"} Mar 20 11:41:42 crc kubenswrapper[4860]: I0320 11:41:42.825788 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7pbqv" podStartSLOduration=3.512832007 podStartE2EDuration="8.825764183s" podCreationTimestamp="2026-03-20 11:41:34 +0000 UTC" firstStartedPulling="2026-03-20 11:41:36.734061346 +0000 UTC m=+2820.955422244" lastFinishedPulling="2026-03-20 11:41:42.046993522 +0000 UTC m=+2826.268354420" observedRunningTime="2026-03-20 11:41:42.825742172 +0000 UTC m=+2827.047103080" watchObservedRunningTime="2026-03-20 11:41:42.825764183 +0000 UTC m=+2827.047125081" Mar 20 11:41:44 crc kubenswrapper[4860]: I0320 11:41:44.948698 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7pbqv" Mar 20 11:41:44 crc kubenswrapper[4860]: I0320 11:41:44.949159 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7pbqv" Mar 20 11:41:45 crc kubenswrapper[4860]: I0320 11:41:45.001740 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7pbqv" Mar 20 11:41:54 crc kubenswrapper[4860]: I0320 11:41:54.999027 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7pbqv" Mar 20 11:41:55 crc kubenswrapper[4860]: I0320 11:41:55.060249 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-7pbqv"] Mar 20 11:41:55 crc kubenswrapper[4860]: I0320 11:41:55.900010 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7pbqv" podUID="c2af21af-50b4-4c92-9d39-40c326084305" containerName="registry-server" containerID="cri-o://1d10f0d6586195959bd87e5f3f283c5d0718b91b40a5ef2ac461e5f70cc27bb0" gracePeriod=2 Mar 20 11:41:56 crc kubenswrapper[4860]: I0320 11:41:56.348702 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7pbqv" Mar 20 11:41:56 crc kubenswrapper[4860]: I0320 11:41:56.482801 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hrf7f\" (UniqueName: \"kubernetes.io/projected/c2af21af-50b4-4c92-9d39-40c326084305-kube-api-access-hrf7f\") pod \"c2af21af-50b4-4c92-9d39-40c326084305\" (UID: \"c2af21af-50b4-4c92-9d39-40c326084305\") " Mar 20 11:41:56 crc kubenswrapper[4860]: I0320 11:41:56.482959 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2af21af-50b4-4c92-9d39-40c326084305-catalog-content\") pod \"c2af21af-50b4-4c92-9d39-40c326084305\" (UID: \"c2af21af-50b4-4c92-9d39-40c326084305\") " Mar 20 11:41:56 crc kubenswrapper[4860]: I0320 11:41:56.491448 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2af21af-50b4-4c92-9d39-40c326084305-utilities\") pod \"c2af21af-50b4-4c92-9d39-40c326084305\" (UID: \"c2af21af-50b4-4c92-9d39-40c326084305\") " Mar 20 11:41:56 crc kubenswrapper[4860]: I0320 11:41:56.492453 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2af21af-50b4-4c92-9d39-40c326084305-utilities" (OuterVolumeSpecName: "utilities") pod "c2af21af-50b4-4c92-9d39-40c326084305" (UID: 
"c2af21af-50b4-4c92-9d39-40c326084305"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:41:56 crc kubenswrapper[4860]: I0320 11:41:56.495498 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2af21af-50b4-4c92-9d39-40c326084305-kube-api-access-hrf7f" (OuterVolumeSpecName: "kube-api-access-hrf7f") pod "c2af21af-50b4-4c92-9d39-40c326084305" (UID: "c2af21af-50b4-4c92-9d39-40c326084305"). InnerVolumeSpecName "kube-api-access-hrf7f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:41:56 crc kubenswrapper[4860]: I0320 11:41:56.540915 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2af21af-50b4-4c92-9d39-40c326084305-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c2af21af-50b4-4c92-9d39-40c326084305" (UID: "c2af21af-50b4-4c92-9d39-40c326084305"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:41:56 crc kubenswrapper[4860]: I0320 11:41:56.601541 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hrf7f\" (UniqueName: \"kubernetes.io/projected/c2af21af-50b4-4c92-9d39-40c326084305-kube-api-access-hrf7f\") on node \"crc\" DevicePath \"\"" Mar 20 11:41:56 crc kubenswrapper[4860]: I0320 11:41:56.601577 4860 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2af21af-50b4-4c92-9d39-40c326084305-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:41:56 crc kubenswrapper[4860]: I0320 11:41:56.601589 4860 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2af21af-50b4-4c92-9d39-40c326084305-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:41:56 crc kubenswrapper[4860]: I0320 11:41:56.910257 4860 generic.go:334] "Generic (PLEG): container finished" 
podID="c2af21af-50b4-4c92-9d39-40c326084305" containerID="1d10f0d6586195959bd87e5f3f283c5d0718b91b40a5ef2ac461e5f70cc27bb0" exitCode=0 Mar 20 11:41:56 crc kubenswrapper[4860]: I0320 11:41:56.910312 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7pbqv" event={"ID":"c2af21af-50b4-4c92-9d39-40c326084305","Type":"ContainerDied","Data":"1d10f0d6586195959bd87e5f3f283c5d0718b91b40a5ef2ac461e5f70cc27bb0"} Mar 20 11:41:56 crc kubenswrapper[4860]: I0320 11:41:56.910355 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7pbqv" event={"ID":"c2af21af-50b4-4c92-9d39-40c326084305","Type":"ContainerDied","Data":"e5ac5ddd5c069f37405eb54fe0ce2a92994ce39706f8b4d4aa25212406b4a77b"} Mar 20 11:41:56 crc kubenswrapper[4860]: I0320 11:41:56.910375 4860 scope.go:117] "RemoveContainer" containerID="1d10f0d6586195959bd87e5f3f283c5d0718b91b40a5ef2ac461e5f70cc27bb0" Mar 20 11:41:56 crc kubenswrapper[4860]: I0320 11:41:56.910379 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7pbqv" Mar 20 11:41:56 crc kubenswrapper[4860]: I0320 11:41:56.935770 4860 scope.go:117] "RemoveContainer" containerID="75b271fbe5c04e26afe7485152c4661d6f947a3f32fdbb61994bf363b9f198d8" Mar 20 11:41:56 crc kubenswrapper[4860]: I0320 11:41:56.950886 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7pbqv"] Mar 20 11:41:56 crc kubenswrapper[4860]: I0320 11:41:56.958448 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7pbqv"] Mar 20 11:41:56 crc kubenswrapper[4860]: I0320 11:41:56.961116 4860 scope.go:117] "RemoveContainer" containerID="e56de3eee5e35cceaccf8c3589499f33bf561e49b0aa363dbe5ed01f4a6b7179" Mar 20 11:41:56 crc kubenswrapper[4860]: I0320 11:41:56.988307 4860 scope.go:117] "RemoveContainer" containerID="1d10f0d6586195959bd87e5f3f283c5d0718b91b40a5ef2ac461e5f70cc27bb0" Mar 20 11:41:56 crc kubenswrapper[4860]: E0320 11:41:56.988993 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d10f0d6586195959bd87e5f3f283c5d0718b91b40a5ef2ac461e5f70cc27bb0\": container with ID starting with 1d10f0d6586195959bd87e5f3f283c5d0718b91b40a5ef2ac461e5f70cc27bb0 not found: ID does not exist" containerID="1d10f0d6586195959bd87e5f3f283c5d0718b91b40a5ef2ac461e5f70cc27bb0" Mar 20 11:41:56 crc kubenswrapper[4860]: I0320 11:41:56.989077 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d10f0d6586195959bd87e5f3f283c5d0718b91b40a5ef2ac461e5f70cc27bb0"} err="failed to get container status \"1d10f0d6586195959bd87e5f3f283c5d0718b91b40a5ef2ac461e5f70cc27bb0\": rpc error: code = NotFound desc = could not find container \"1d10f0d6586195959bd87e5f3f283c5d0718b91b40a5ef2ac461e5f70cc27bb0\": container with ID starting with 1d10f0d6586195959bd87e5f3f283c5d0718b91b40a5ef2ac461e5f70cc27bb0 not 
found: ID does not exist" Mar 20 11:41:56 crc kubenswrapper[4860]: I0320 11:41:56.989129 4860 scope.go:117] "RemoveContainer" containerID="75b271fbe5c04e26afe7485152c4661d6f947a3f32fdbb61994bf363b9f198d8" Mar 20 11:41:56 crc kubenswrapper[4860]: E0320 11:41:56.989540 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75b271fbe5c04e26afe7485152c4661d6f947a3f32fdbb61994bf363b9f198d8\": container with ID starting with 75b271fbe5c04e26afe7485152c4661d6f947a3f32fdbb61994bf363b9f198d8 not found: ID does not exist" containerID="75b271fbe5c04e26afe7485152c4661d6f947a3f32fdbb61994bf363b9f198d8" Mar 20 11:41:56 crc kubenswrapper[4860]: I0320 11:41:56.989588 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75b271fbe5c04e26afe7485152c4661d6f947a3f32fdbb61994bf363b9f198d8"} err="failed to get container status \"75b271fbe5c04e26afe7485152c4661d6f947a3f32fdbb61994bf363b9f198d8\": rpc error: code = NotFound desc = could not find container \"75b271fbe5c04e26afe7485152c4661d6f947a3f32fdbb61994bf363b9f198d8\": container with ID starting with 75b271fbe5c04e26afe7485152c4661d6f947a3f32fdbb61994bf363b9f198d8 not found: ID does not exist" Mar 20 11:41:56 crc kubenswrapper[4860]: I0320 11:41:56.989625 4860 scope.go:117] "RemoveContainer" containerID="e56de3eee5e35cceaccf8c3589499f33bf561e49b0aa363dbe5ed01f4a6b7179" Mar 20 11:41:56 crc kubenswrapper[4860]: E0320 11:41:56.989992 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e56de3eee5e35cceaccf8c3589499f33bf561e49b0aa363dbe5ed01f4a6b7179\": container with ID starting with e56de3eee5e35cceaccf8c3589499f33bf561e49b0aa363dbe5ed01f4a6b7179 not found: ID does not exist" containerID="e56de3eee5e35cceaccf8c3589499f33bf561e49b0aa363dbe5ed01f4a6b7179" Mar 20 11:41:56 crc kubenswrapper[4860]: I0320 11:41:56.990029 4860 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e56de3eee5e35cceaccf8c3589499f33bf561e49b0aa363dbe5ed01f4a6b7179"} err="failed to get container status \"e56de3eee5e35cceaccf8c3589499f33bf561e49b0aa363dbe5ed01f4a6b7179\": rpc error: code = NotFound desc = could not find container \"e56de3eee5e35cceaccf8c3589499f33bf561e49b0aa363dbe5ed01f4a6b7179\": container with ID starting with e56de3eee5e35cceaccf8c3589499f33bf561e49b0aa363dbe5ed01f4a6b7179 not found: ID does not exist" Mar 20 11:41:57 crc kubenswrapper[4860]: I0320 11:41:57.424660 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2af21af-50b4-4c92-9d39-40c326084305" path="/var/lib/kubelet/pods/c2af21af-50b4-4c92-9d39-40c326084305/volumes" Mar 20 11:42:00 crc kubenswrapper[4860]: I0320 11:42:00.151507 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566782-lzh8h"] Mar 20 11:42:00 crc kubenswrapper[4860]: E0320 11:42:00.152942 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2af21af-50b4-4c92-9d39-40c326084305" containerName="registry-server" Mar 20 11:42:00 crc kubenswrapper[4860]: I0320 11:42:00.152966 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2af21af-50b4-4c92-9d39-40c326084305" containerName="registry-server" Mar 20 11:42:00 crc kubenswrapper[4860]: E0320 11:42:00.152983 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2af21af-50b4-4c92-9d39-40c326084305" containerName="extract-utilities" Mar 20 11:42:00 crc kubenswrapper[4860]: I0320 11:42:00.152994 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2af21af-50b4-4c92-9d39-40c326084305" containerName="extract-utilities" Mar 20 11:42:00 crc kubenswrapper[4860]: E0320 11:42:00.153004 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2af21af-50b4-4c92-9d39-40c326084305" containerName="extract-content" Mar 20 11:42:00 crc kubenswrapper[4860]: I0320 
11:42:00.153016 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2af21af-50b4-4c92-9d39-40c326084305" containerName="extract-content" Mar 20 11:42:00 crc kubenswrapper[4860]: I0320 11:42:00.153262 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2af21af-50b4-4c92-9d39-40c326084305" containerName="registry-server" Mar 20 11:42:00 crc kubenswrapper[4860]: I0320 11:42:00.153978 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566782-lzh8h" Mar 20 11:42:00 crc kubenswrapper[4860]: I0320 11:42:00.158134 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:42:00 crc kubenswrapper[4860]: I0320 11:42:00.158287 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-82p2f" Mar 20 11:42:00 crc kubenswrapper[4860]: I0320 11:42:00.158489 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:42:00 crc kubenswrapper[4860]: I0320 11:42:00.168739 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566782-lzh8h"] Mar 20 11:42:00 crc kubenswrapper[4860]: I0320 11:42:00.259692 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpxk7\" (UniqueName: \"kubernetes.io/projected/cfc74c40-1bf8-47fb-91a4-a6e27724dff9-kube-api-access-wpxk7\") pod \"auto-csr-approver-29566782-lzh8h\" (UID: \"cfc74c40-1bf8-47fb-91a4-a6e27724dff9\") " pod="openshift-infra/auto-csr-approver-29566782-lzh8h" Mar 20 11:42:00 crc kubenswrapper[4860]: I0320 11:42:00.361608 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpxk7\" (UniqueName: \"kubernetes.io/projected/cfc74c40-1bf8-47fb-91a4-a6e27724dff9-kube-api-access-wpxk7\") pod \"auto-csr-approver-29566782-lzh8h\" (UID: 
\"cfc74c40-1bf8-47fb-91a4-a6e27724dff9\") " pod="openshift-infra/auto-csr-approver-29566782-lzh8h" Mar 20 11:42:00 crc kubenswrapper[4860]: I0320 11:42:00.387520 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpxk7\" (UniqueName: \"kubernetes.io/projected/cfc74c40-1bf8-47fb-91a4-a6e27724dff9-kube-api-access-wpxk7\") pod \"auto-csr-approver-29566782-lzh8h\" (UID: \"cfc74c40-1bf8-47fb-91a4-a6e27724dff9\") " pod="openshift-infra/auto-csr-approver-29566782-lzh8h" Mar 20 11:42:00 crc kubenswrapper[4860]: I0320 11:42:00.477879 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566782-lzh8h" Mar 20 11:42:00 crc kubenswrapper[4860]: I0320 11:42:00.985237 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566782-lzh8h"] Mar 20 11:42:01 crc kubenswrapper[4860]: I0320 11:42:01.955118 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566782-lzh8h" event={"ID":"cfc74c40-1bf8-47fb-91a4-a6e27724dff9","Type":"ContainerStarted","Data":"98d3d86ac7e10d74407be23825b9d1285c1fb4fb54635183d9c78ae08f65aa5a"} Mar 20 11:42:02 crc kubenswrapper[4860]: I0320 11:42:02.964686 4860 generic.go:334] "Generic (PLEG): container finished" podID="cfc74c40-1bf8-47fb-91a4-a6e27724dff9" containerID="9f2e26a1e8f88c5ac76c8ad0792718b788b15449739711bc2614bdd4cd541855" exitCode=0 Mar 20 11:42:02 crc kubenswrapper[4860]: I0320 11:42:02.964867 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566782-lzh8h" event={"ID":"cfc74c40-1bf8-47fb-91a4-a6e27724dff9","Type":"ContainerDied","Data":"9f2e26a1e8f88c5ac76c8ad0792718b788b15449739711bc2614bdd4cd541855"} Mar 20 11:42:04 crc kubenswrapper[4860]: I0320 11:42:04.266264 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566782-lzh8h" Mar 20 11:42:04 crc kubenswrapper[4860]: I0320 11:42:04.435620 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wpxk7\" (UniqueName: \"kubernetes.io/projected/cfc74c40-1bf8-47fb-91a4-a6e27724dff9-kube-api-access-wpxk7\") pod \"cfc74c40-1bf8-47fb-91a4-a6e27724dff9\" (UID: \"cfc74c40-1bf8-47fb-91a4-a6e27724dff9\") " Mar 20 11:42:04 crc kubenswrapper[4860]: I0320 11:42:04.443998 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfc74c40-1bf8-47fb-91a4-a6e27724dff9-kube-api-access-wpxk7" (OuterVolumeSpecName: "kube-api-access-wpxk7") pod "cfc74c40-1bf8-47fb-91a4-a6e27724dff9" (UID: "cfc74c40-1bf8-47fb-91a4-a6e27724dff9"). InnerVolumeSpecName "kube-api-access-wpxk7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:42:04 crc kubenswrapper[4860]: I0320 11:42:04.537861 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wpxk7\" (UniqueName: \"kubernetes.io/projected/cfc74c40-1bf8-47fb-91a4-a6e27724dff9-kube-api-access-wpxk7\") on node \"crc\" DevicePath \"\"" Mar 20 11:42:04 crc kubenswrapper[4860]: I0320 11:42:04.982317 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566782-lzh8h" event={"ID":"cfc74c40-1bf8-47fb-91a4-a6e27724dff9","Type":"ContainerDied","Data":"98d3d86ac7e10d74407be23825b9d1285c1fb4fb54635183d9c78ae08f65aa5a"} Mar 20 11:42:04 crc kubenswrapper[4860]: I0320 11:42:04.982719 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98d3d86ac7e10d74407be23825b9d1285c1fb4fb54635183d9c78ae08f65aa5a" Mar 20 11:42:04 crc kubenswrapper[4860]: I0320 11:42:04.982382 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566782-lzh8h" Mar 20 11:42:05 crc kubenswrapper[4860]: I0320 11:42:05.358431 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566776-6j7wf"] Mar 20 11:42:05 crc kubenswrapper[4860]: I0320 11:42:05.377148 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566776-6j7wf"] Mar 20 11:42:05 crc kubenswrapper[4860]: I0320 11:42:05.422954 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd9943d6-a840-4ba5-b12c-9ebf3cbd1224" path="/var/lib/kubelet/pods/fd9943d6-a840-4ba5-b12c-9ebf3cbd1224/volumes" Mar 20 11:42:49 crc kubenswrapper[4860]: I0320 11:42:49.214213 4860 scope.go:117] "RemoveContainer" containerID="d289c9e2018c59ba8e3c1ee7b7a591cd4418a9530b3b771d0a13873b671d4bdf" Mar 20 11:42:52 crc kubenswrapper[4860]: I0320 11:42:52.343957 4860 patch_prober.go:28] interesting pod/machine-config-daemon-kvdqp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:42:52 crc kubenswrapper[4860]: I0320 11:42:52.344434 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:43:02 crc kubenswrapper[4860]: I0320 11:43:02.496631 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fzc9q"] Mar 20 11:43:02 crc kubenswrapper[4860]: E0320 11:43:02.498394 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfc74c40-1bf8-47fb-91a4-a6e27724dff9" containerName="oc" Mar 20 11:43:02 crc 
kubenswrapper[4860]: I0320 11:43:02.498410 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfc74c40-1bf8-47fb-91a4-a6e27724dff9" containerName="oc" Mar 20 11:43:02 crc kubenswrapper[4860]: I0320 11:43:02.498582 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfc74c40-1bf8-47fb-91a4-a6e27724dff9" containerName="oc" Mar 20 11:43:02 crc kubenswrapper[4860]: I0320 11:43:02.499736 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fzc9q" Mar 20 11:43:02 crc kubenswrapper[4860]: I0320 11:43:02.513527 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fzc9q"] Mar 20 11:43:02 crc kubenswrapper[4860]: I0320 11:43:02.621301 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83074f2a-d218-47f7-8c37-8b5195f77210-catalog-content\") pod \"redhat-operators-fzc9q\" (UID: \"83074f2a-d218-47f7-8c37-8b5195f77210\") " pod="openshift-marketplace/redhat-operators-fzc9q" Mar 20 11:43:02 crc kubenswrapper[4860]: I0320 11:43:02.621407 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83074f2a-d218-47f7-8c37-8b5195f77210-utilities\") pod \"redhat-operators-fzc9q\" (UID: \"83074f2a-d218-47f7-8c37-8b5195f77210\") " pod="openshift-marketplace/redhat-operators-fzc9q" Mar 20 11:43:02 crc kubenswrapper[4860]: I0320 11:43:02.621531 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwdmj\" (UniqueName: \"kubernetes.io/projected/83074f2a-d218-47f7-8c37-8b5195f77210-kube-api-access-jwdmj\") pod \"redhat-operators-fzc9q\" (UID: \"83074f2a-d218-47f7-8c37-8b5195f77210\") " pod="openshift-marketplace/redhat-operators-fzc9q" Mar 20 11:43:02 crc kubenswrapper[4860]: I0320 
11:43:02.722847 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83074f2a-d218-47f7-8c37-8b5195f77210-catalog-content\") pod \"redhat-operators-fzc9q\" (UID: \"83074f2a-d218-47f7-8c37-8b5195f77210\") " pod="openshift-marketplace/redhat-operators-fzc9q" Mar 20 11:43:02 crc kubenswrapper[4860]: I0320 11:43:02.722922 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83074f2a-d218-47f7-8c37-8b5195f77210-utilities\") pod \"redhat-operators-fzc9q\" (UID: \"83074f2a-d218-47f7-8c37-8b5195f77210\") " pod="openshift-marketplace/redhat-operators-fzc9q" Mar 20 11:43:02 crc kubenswrapper[4860]: I0320 11:43:02.722952 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwdmj\" (UniqueName: \"kubernetes.io/projected/83074f2a-d218-47f7-8c37-8b5195f77210-kube-api-access-jwdmj\") pod \"redhat-operators-fzc9q\" (UID: \"83074f2a-d218-47f7-8c37-8b5195f77210\") " pod="openshift-marketplace/redhat-operators-fzc9q" Mar 20 11:43:02 crc kubenswrapper[4860]: I0320 11:43:02.723524 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83074f2a-d218-47f7-8c37-8b5195f77210-catalog-content\") pod \"redhat-operators-fzc9q\" (UID: \"83074f2a-d218-47f7-8c37-8b5195f77210\") " pod="openshift-marketplace/redhat-operators-fzc9q" Mar 20 11:43:02 crc kubenswrapper[4860]: I0320 11:43:02.723641 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83074f2a-d218-47f7-8c37-8b5195f77210-utilities\") pod \"redhat-operators-fzc9q\" (UID: \"83074f2a-d218-47f7-8c37-8b5195f77210\") " pod="openshift-marketplace/redhat-operators-fzc9q" Mar 20 11:43:02 crc kubenswrapper[4860]: I0320 11:43:02.745884 4860 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-jwdmj\" (UniqueName: \"kubernetes.io/projected/83074f2a-d218-47f7-8c37-8b5195f77210-kube-api-access-jwdmj\") pod \"redhat-operators-fzc9q\" (UID: \"83074f2a-d218-47f7-8c37-8b5195f77210\") " pod="openshift-marketplace/redhat-operators-fzc9q" Mar 20 11:43:02 crc kubenswrapper[4860]: I0320 11:43:02.819775 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fzc9q" Mar 20 11:43:03 crc kubenswrapper[4860]: I0320 11:43:03.160699 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fzc9q"] Mar 20 11:43:03 crc kubenswrapper[4860]: I0320 11:43:03.475409 4860 generic.go:334] "Generic (PLEG): container finished" podID="83074f2a-d218-47f7-8c37-8b5195f77210" containerID="360f2a33729f08c427fc3929d1af0e724d578c412b1459b26ec5648e8ddf3a0d" exitCode=0 Mar 20 11:43:03 crc kubenswrapper[4860]: I0320 11:43:03.475931 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fzc9q" event={"ID":"83074f2a-d218-47f7-8c37-8b5195f77210","Type":"ContainerDied","Data":"360f2a33729f08c427fc3929d1af0e724d578c412b1459b26ec5648e8ddf3a0d"} Mar 20 11:43:03 crc kubenswrapper[4860]: I0320 11:43:03.475969 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fzc9q" event={"ID":"83074f2a-d218-47f7-8c37-8b5195f77210","Type":"ContainerStarted","Data":"8c2d1e411a59750e89077ddc91c2309ab3e7983b7e0eb52dca2b617cef84b7ff"} Mar 20 11:43:05 crc kubenswrapper[4860]: I0320 11:43:05.494069 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fzc9q" event={"ID":"83074f2a-d218-47f7-8c37-8b5195f77210","Type":"ContainerStarted","Data":"f056246561ac73e753fd4dcd7635df4308d4205a262f90e122068c8479290fda"} Mar 20 11:43:06 crc kubenswrapper[4860]: I0320 11:43:06.505159 4860 generic.go:334] "Generic (PLEG): container finished" 
podID="83074f2a-d218-47f7-8c37-8b5195f77210" containerID="f056246561ac73e753fd4dcd7635df4308d4205a262f90e122068c8479290fda" exitCode=0 Mar 20 11:43:06 crc kubenswrapper[4860]: I0320 11:43:06.505255 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fzc9q" event={"ID":"83074f2a-d218-47f7-8c37-8b5195f77210","Type":"ContainerDied","Data":"f056246561ac73e753fd4dcd7635df4308d4205a262f90e122068c8479290fda"} Mar 20 11:43:07 crc kubenswrapper[4860]: I0320 11:43:07.518443 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fzc9q" event={"ID":"83074f2a-d218-47f7-8c37-8b5195f77210","Type":"ContainerStarted","Data":"de6e2e6924a1a544a7dd810a70e0a5780b512d6321f32637c1a4e4c4baf53578"} Mar 20 11:43:07 crc kubenswrapper[4860]: I0320 11:43:07.551688 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fzc9q" podStartSLOduration=1.998661717 podStartE2EDuration="5.551665412s" podCreationTimestamp="2026-03-20 11:43:02 +0000 UTC" firstStartedPulling="2026-03-20 11:43:03.477859127 +0000 UTC m=+2907.699220025" lastFinishedPulling="2026-03-20 11:43:07.030862782 +0000 UTC m=+2911.252223720" observedRunningTime="2026-03-20 11:43:07.545343712 +0000 UTC m=+2911.766704610" watchObservedRunningTime="2026-03-20 11:43:07.551665412 +0000 UTC m=+2911.773026311" Mar 20 11:43:12 crc kubenswrapper[4860]: I0320 11:43:12.820078 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fzc9q" Mar 20 11:43:12 crc kubenswrapper[4860]: I0320 11:43:12.823515 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fzc9q" Mar 20 11:43:13 crc kubenswrapper[4860]: I0320 11:43:13.880105 4860 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fzc9q" podUID="83074f2a-d218-47f7-8c37-8b5195f77210" 
containerName="registry-server" probeResult="failure" output=< Mar 20 11:43:13 crc kubenswrapper[4860]: timeout: failed to connect service ":50051" within 1s Mar 20 11:43:13 crc kubenswrapper[4860]: > Mar 20 11:43:22 crc kubenswrapper[4860]: I0320 11:43:22.344071 4860 patch_prober.go:28] interesting pod/machine-config-daemon-kvdqp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:43:22 crc kubenswrapper[4860]: I0320 11:43:22.344645 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:43:22 crc kubenswrapper[4860]: I0320 11:43:22.870997 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fzc9q" Mar 20 11:43:22 crc kubenswrapper[4860]: I0320 11:43:22.925256 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fzc9q" Mar 20 11:43:23 crc kubenswrapper[4860]: I0320 11:43:23.110490 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fzc9q"] Mar 20 11:43:24 crc kubenswrapper[4860]: I0320 11:43:24.659543 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fzc9q" podUID="83074f2a-d218-47f7-8c37-8b5195f77210" containerName="registry-server" containerID="cri-o://de6e2e6924a1a544a7dd810a70e0a5780b512d6321f32637c1a4e4c4baf53578" gracePeriod=2 Mar 20 11:43:25 crc kubenswrapper[4860]: I0320 11:43:25.085969 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fzc9q" Mar 20 11:43:25 crc kubenswrapper[4860]: I0320 11:43:25.133630 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83074f2a-d218-47f7-8c37-8b5195f77210-catalog-content\") pod \"83074f2a-d218-47f7-8c37-8b5195f77210\" (UID: \"83074f2a-d218-47f7-8c37-8b5195f77210\") " Mar 20 11:43:25 crc kubenswrapper[4860]: I0320 11:43:25.133804 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwdmj\" (UniqueName: \"kubernetes.io/projected/83074f2a-d218-47f7-8c37-8b5195f77210-kube-api-access-jwdmj\") pod \"83074f2a-d218-47f7-8c37-8b5195f77210\" (UID: \"83074f2a-d218-47f7-8c37-8b5195f77210\") " Mar 20 11:43:25 crc kubenswrapper[4860]: I0320 11:43:25.133848 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83074f2a-d218-47f7-8c37-8b5195f77210-utilities\") pod \"83074f2a-d218-47f7-8c37-8b5195f77210\" (UID: \"83074f2a-d218-47f7-8c37-8b5195f77210\") " Mar 20 11:43:25 crc kubenswrapper[4860]: I0320 11:43:25.134991 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83074f2a-d218-47f7-8c37-8b5195f77210-utilities" (OuterVolumeSpecName: "utilities") pod "83074f2a-d218-47f7-8c37-8b5195f77210" (UID: "83074f2a-d218-47f7-8c37-8b5195f77210"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:43:25 crc kubenswrapper[4860]: I0320 11:43:25.143633 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83074f2a-d218-47f7-8c37-8b5195f77210-kube-api-access-jwdmj" (OuterVolumeSpecName: "kube-api-access-jwdmj") pod "83074f2a-d218-47f7-8c37-8b5195f77210" (UID: "83074f2a-d218-47f7-8c37-8b5195f77210"). InnerVolumeSpecName "kube-api-access-jwdmj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:43:25 crc kubenswrapper[4860]: I0320 11:43:25.236780 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwdmj\" (UniqueName: \"kubernetes.io/projected/83074f2a-d218-47f7-8c37-8b5195f77210-kube-api-access-jwdmj\") on node \"crc\" DevicePath \"\"" Mar 20 11:43:25 crc kubenswrapper[4860]: I0320 11:43:25.236833 4860 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83074f2a-d218-47f7-8c37-8b5195f77210-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:43:25 crc kubenswrapper[4860]: I0320 11:43:25.272555 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83074f2a-d218-47f7-8c37-8b5195f77210-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "83074f2a-d218-47f7-8c37-8b5195f77210" (UID: "83074f2a-d218-47f7-8c37-8b5195f77210"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:43:25 crc kubenswrapper[4860]: I0320 11:43:25.337948 4860 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83074f2a-d218-47f7-8c37-8b5195f77210-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:43:25 crc kubenswrapper[4860]: I0320 11:43:25.671866 4860 generic.go:334] "Generic (PLEG): container finished" podID="83074f2a-d218-47f7-8c37-8b5195f77210" containerID="de6e2e6924a1a544a7dd810a70e0a5780b512d6321f32637c1a4e4c4baf53578" exitCode=0 Mar 20 11:43:25 crc kubenswrapper[4860]: I0320 11:43:25.671908 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fzc9q" event={"ID":"83074f2a-d218-47f7-8c37-8b5195f77210","Type":"ContainerDied","Data":"de6e2e6924a1a544a7dd810a70e0a5780b512d6321f32637c1a4e4c4baf53578"} Mar 20 11:43:25 crc kubenswrapper[4860]: I0320 11:43:25.671972 4860 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-fzc9q" event={"ID":"83074f2a-d218-47f7-8c37-8b5195f77210","Type":"ContainerDied","Data":"8c2d1e411a59750e89077ddc91c2309ab3e7983b7e0eb52dca2b617cef84b7ff"} Mar 20 11:43:25 crc kubenswrapper[4860]: I0320 11:43:25.671993 4860 scope.go:117] "RemoveContainer" containerID="de6e2e6924a1a544a7dd810a70e0a5780b512d6321f32637c1a4e4c4baf53578" Mar 20 11:43:25 crc kubenswrapper[4860]: I0320 11:43:25.672034 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fzc9q" Mar 20 11:43:25 crc kubenswrapper[4860]: I0320 11:43:25.692960 4860 scope.go:117] "RemoveContainer" containerID="f056246561ac73e753fd4dcd7635df4308d4205a262f90e122068c8479290fda" Mar 20 11:43:25 crc kubenswrapper[4860]: I0320 11:43:25.702970 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fzc9q"] Mar 20 11:43:25 crc kubenswrapper[4860]: I0320 11:43:25.708999 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fzc9q"] Mar 20 11:43:25 crc kubenswrapper[4860]: I0320 11:43:25.718931 4860 scope.go:117] "RemoveContainer" containerID="360f2a33729f08c427fc3929d1af0e724d578c412b1459b26ec5648e8ddf3a0d" Mar 20 11:43:25 crc kubenswrapper[4860]: I0320 11:43:25.737050 4860 scope.go:117] "RemoveContainer" containerID="de6e2e6924a1a544a7dd810a70e0a5780b512d6321f32637c1a4e4c4baf53578" Mar 20 11:43:25 crc kubenswrapper[4860]: E0320 11:43:25.737582 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de6e2e6924a1a544a7dd810a70e0a5780b512d6321f32637c1a4e4c4baf53578\": container with ID starting with de6e2e6924a1a544a7dd810a70e0a5780b512d6321f32637c1a4e4c4baf53578 not found: ID does not exist" containerID="de6e2e6924a1a544a7dd810a70e0a5780b512d6321f32637c1a4e4c4baf53578" Mar 20 11:43:25 crc kubenswrapper[4860]: I0320 11:43:25.737619 4860 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de6e2e6924a1a544a7dd810a70e0a5780b512d6321f32637c1a4e4c4baf53578"} err="failed to get container status \"de6e2e6924a1a544a7dd810a70e0a5780b512d6321f32637c1a4e4c4baf53578\": rpc error: code = NotFound desc = could not find container \"de6e2e6924a1a544a7dd810a70e0a5780b512d6321f32637c1a4e4c4baf53578\": container with ID starting with de6e2e6924a1a544a7dd810a70e0a5780b512d6321f32637c1a4e4c4baf53578 not found: ID does not exist" Mar 20 11:43:25 crc kubenswrapper[4860]: I0320 11:43:25.737726 4860 scope.go:117] "RemoveContainer" containerID="f056246561ac73e753fd4dcd7635df4308d4205a262f90e122068c8479290fda" Mar 20 11:43:25 crc kubenswrapper[4860]: E0320 11:43:25.738099 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f056246561ac73e753fd4dcd7635df4308d4205a262f90e122068c8479290fda\": container with ID starting with f056246561ac73e753fd4dcd7635df4308d4205a262f90e122068c8479290fda not found: ID does not exist" containerID="f056246561ac73e753fd4dcd7635df4308d4205a262f90e122068c8479290fda" Mar 20 11:43:25 crc kubenswrapper[4860]: I0320 11:43:25.738136 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f056246561ac73e753fd4dcd7635df4308d4205a262f90e122068c8479290fda"} err="failed to get container status \"f056246561ac73e753fd4dcd7635df4308d4205a262f90e122068c8479290fda\": rpc error: code = NotFound desc = could not find container \"f056246561ac73e753fd4dcd7635df4308d4205a262f90e122068c8479290fda\": container with ID starting with f056246561ac73e753fd4dcd7635df4308d4205a262f90e122068c8479290fda not found: ID does not exist" Mar 20 11:43:25 crc kubenswrapper[4860]: I0320 11:43:25.738154 4860 scope.go:117] "RemoveContainer" containerID="360f2a33729f08c427fc3929d1af0e724d578c412b1459b26ec5648e8ddf3a0d" Mar 20 11:43:25 crc kubenswrapper[4860]: E0320 
11:43:25.738454 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"360f2a33729f08c427fc3929d1af0e724d578c412b1459b26ec5648e8ddf3a0d\": container with ID starting with 360f2a33729f08c427fc3929d1af0e724d578c412b1459b26ec5648e8ddf3a0d not found: ID does not exist" containerID="360f2a33729f08c427fc3929d1af0e724d578c412b1459b26ec5648e8ddf3a0d" Mar 20 11:43:25 crc kubenswrapper[4860]: I0320 11:43:25.738500 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"360f2a33729f08c427fc3929d1af0e724d578c412b1459b26ec5648e8ddf3a0d"} err="failed to get container status \"360f2a33729f08c427fc3929d1af0e724d578c412b1459b26ec5648e8ddf3a0d\": rpc error: code = NotFound desc = could not find container \"360f2a33729f08c427fc3929d1af0e724d578c412b1459b26ec5648e8ddf3a0d\": container with ID starting with 360f2a33729f08c427fc3929d1af0e724d578c412b1459b26ec5648e8ddf3a0d not found: ID does not exist" Mar 20 11:43:27 crc kubenswrapper[4860]: I0320 11:43:27.426789 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83074f2a-d218-47f7-8c37-8b5195f77210" path="/var/lib/kubelet/pods/83074f2a-d218-47f7-8c37-8b5195f77210/volumes" Mar 20 11:43:52 crc kubenswrapper[4860]: I0320 11:43:52.344602 4860 patch_prober.go:28] interesting pod/machine-config-daemon-kvdqp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:43:52 crc kubenswrapper[4860]: I0320 11:43:52.345547 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Mar 20 11:43:52 crc kubenswrapper[4860]: I0320 11:43:52.345606 4860 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" Mar 20 11:43:52 crc kubenswrapper[4860]: I0320 11:43:52.346374 4860 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2c6cd87f8a77136a1839b5f761bdf9d0b06d37fb07e23d5035e6ffed2dcf296d"} pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 11:43:52 crc kubenswrapper[4860]: I0320 11:43:52.346448 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" containerID="cri-o://2c6cd87f8a77136a1839b5f761bdf9d0b06d37fb07e23d5035e6ffed2dcf296d" gracePeriod=600 Mar 20 11:43:52 crc kubenswrapper[4860]: E0320 11:43:52.468768 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:43:53 crc kubenswrapper[4860]: I0320 11:43:53.058127 4860 generic.go:334] "Generic (PLEG): container finished" podID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerID="2c6cd87f8a77136a1839b5f761bdf9d0b06d37fb07e23d5035e6ffed2dcf296d" exitCode=0 Mar 20 11:43:53 crc kubenswrapper[4860]: I0320 11:43:53.058168 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" 
event={"ID":"6a9df230-75a1-4b64-8d00-c179e9c19080","Type":"ContainerDied","Data":"2c6cd87f8a77136a1839b5f761bdf9d0b06d37fb07e23d5035e6ffed2dcf296d"} Mar 20 11:43:53 crc kubenswrapper[4860]: I0320 11:43:53.058260 4860 scope.go:117] "RemoveContainer" containerID="855d8453e4234a7aa607d0c491023b12563d04004d3513d67a5fbedfc07f707d" Mar 20 11:43:53 crc kubenswrapper[4860]: I0320 11:43:53.058828 4860 scope.go:117] "RemoveContainer" containerID="2c6cd87f8a77136a1839b5f761bdf9d0b06d37fb07e23d5035e6ffed2dcf296d" Mar 20 11:43:53 crc kubenswrapper[4860]: E0320 11:43:53.059062 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:43:53 crc kubenswrapper[4860]: I0320 11:43:53.554875 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-d78wv"] Mar 20 11:43:53 crc kubenswrapper[4860]: E0320 11:43:53.555306 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83074f2a-d218-47f7-8c37-8b5195f77210" containerName="registry-server" Mar 20 11:43:53 crc kubenswrapper[4860]: I0320 11:43:53.555321 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="83074f2a-d218-47f7-8c37-8b5195f77210" containerName="registry-server" Mar 20 11:43:53 crc kubenswrapper[4860]: E0320 11:43:53.555341 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83074f2a-d218-47f7-8c37-8b5195f77210" containerName="extract-utilities" Mar 20 11:43:53 crc kubenswrapper[4860]: I0320 11:43:53.555347 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="83074f2a-d218-47f7-8c37-8b5195f77210" containerName="extract-utilities" Mar 20 11:43:53 crc 
kubenswrapper[4860]: E0320 11:43:53.555354 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83074f2a-d218-47f7-8c37-8b5195f77210" containerName="extract-content" Mar 20 11:43:53 crc kubenswrapper[4860]: I0320 11:43:53.555362 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="83074f2a-d218-47f7-8c37-8b5195f77210" containerName="extract-content" Mar 20 11:43:53 crc kubenswrapper[4860]: I0320 11:43:53.555493 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="83074f2a-d218-47f7-8c37-8b5195f77210" containerName="registry-server" Mar 20 11:43:53 crc kubenswrapper[4860]: I0320 11:43:53.556656 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d78wv" Mar 20 11:43:53 crc kubenswrapper[4860]: I0320 11:43:53.570904 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d78wv"] Mar 20 11:43:53 crc kubenswrapper[4860]: I0320 11:43:53.708412 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a-utilities\") pod \"redhat-marketplace-d78wv\" (UID: \"57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a\") " pod="openshift-marketplace/redhat-marketplace-d78wv" Mar 20 11:43:53 crc kubenswrapper[4860]: I0320 11:43:53.708502 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxl7f\" (UniqueName: \"kubernetes.io/projected/57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a-kube-api-access-gxl7f\") pod \"redhat-marketplace-d78wv\" (UID: \"57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a\") " pod="openshift-marketplace/redhat-marketplace-d78wv" Mar 20 11:43:53 crc kubenswrapper[4860]: I0320 11:43:53.708526 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a-catalog-content\") pod \"redhat-marketplace-d78wv\" (UID: \"57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a\") " pod="openshift-marketplace/redhat-marketplace-d78wv" Mar 20 11:43:53 crc kubenswrapper[4860]: I0320 11:43:53.809541 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a-utilities\") pod \"redhat-marketplace-d78wv\" (UID: \"57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a\") " pod="openshift-marketplace/redhat-marketplace-d78wv" Mar 20 11:43:53 crc kubenswrapper[4860]: I0320 11:43:53.809644 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxl7f\" (UniqueName: \"kubernetes.io/projected/57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a-kube-api-access-gxl7f\") pod \"redhat-marketplace-d78wv\" (UID: \"57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a\") " pod="openshift-marketplace/redhat-marketplace-d78wv" Mar 20 11:43:53 crc kubenswrapper[4860]: I0320 11:43:53.809669 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a-catalog-content\") pod \"redhat-marketplace-d78wv\" (UID: \"57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a\") " pod="openshift-marketplace/redhat-marketplace-d78wv" Mar 20 11:43:53 crc kubenswrapper[4860]: I0320 11:43:53.810071 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a-utilities\") pod \"redhat-marketplace-d78wv\" (UID: \"57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a\") " pod="openshift-marketplace/redhat-marketplace-d78wv" Mar 20 11:43:53 crc kubenswrapper[4860]: I0320 11:43:53.810187 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a-catalog-content\") pod \"redhat-marketplace-d78wv\" (UID: \"57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a\") " pod="openshift-marketplace/redhat-marketplace-d78wv" Mar 20 11:43:53 crc kubenswrapper[4860]: I0320 11:43:53.836098 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxl7f\" (UniqueName: \"kubernetes.io/projected/57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a-kube-api-access-gxl7f\") pod \"redhat-marketplace-d78wv\" (UID: \"57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a\") " pod="openshift-marketplace/redhat-marketplace-d78wv" Mar 20 11:43:53 crc kubenswrapper[4860]: I0320 11:43:53.875648 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d78wv" Mar 20 11:43:54 crc kubenswrapper[4860]: I0320 11:43:54.372628 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d78wv"] Mar 20 11:43:55 crc kubenswrapper[4860]: I0320 11:43:55.102680 4860 generic.go:334] "Generic (PLEG): container finished" podID="57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a" containerID="475e3880bff1e3fbf576238a9c0745442d7100eea10561e119a99ffa1fd6e182" exitCode=0 Mar 20 11:43:55 crc kubenswrapper[4860]: I0320 11:43:55.102912 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d78wv" event={"ID":"57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a","Type":"ContainerDied","Data":"475e3880bff1e3fbf576238a9c0745442d7100eea10561e119a99ffa1fd6e182"} Mar 20 11:43:55 crc kubenswrapper[4860]: I0320 11:43:55.103165 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d78wv" event={"ID":"57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a","Type":"ContainerStarted","Data":"8c98d7e4ab87dca42cddfef62a946a203e0380f2d3bb440c1fe269c8f7a89579"} Mar 20 11:43:56 crc kubenswrapper[4860]: I0320 11:43:56.115850 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-d78wv" event={"ID":"57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a","Type":"ContainerStarted","Data":"0c4ff035ec5a56ad01f329fb002f9a6ff79489a6fd1e4b8b2a20d7a28d179176"} Mar 20 11:43:57 crc kubenswrapper[4860]: I0320 11:43:57.123788 4860 generic.go:334] "Generic (PLEG): container finished" podID="57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a" containerID="0c4ff035ec5a56ad01f329fb002f9a6ff79489a6fd1e4b8b2a20d7a28d179176" exitCode=0 Mar 20 11:43:57 crc kubenswrapper[4860]: I0320 11:43:57.123840 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d78wv" event={"ID":"57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a","Type":"ContainerDied","Data":"0c4ff035ec5a56ad01f329fb002f9a6ff79489a6fd1e4b8b2a20d7a28d179176"} Mar 20 11:43:58 crc kubenswrapper[4860]: I0320 11:43:58.156658 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d78wv" event={"ID":"57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a","Type":"ContainerStarted","Data":"d1282cf4606bd21a07325c48667aa660023db3bb611e39a453b19febefbd717c"} Mar 20 11:43:58 crc kubenswrapper[4860]: I0320 11:43:58.189787 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-d78wv" podStartSLOduration=2.775757859 podStartE2EDuration="5.189761093s" podCreationTimestamp="2026-03-20 11:43:53 +0000 UTC" firstStartedPulling="2026-03-20 11:43:55.105542007 +0000 UTC m=+2959.326902905" lastFinishedPulling="2026-03-20 11:43:57.519545241 +0000 UTC m=+2961.740906139" observedRunningTime="2026-03-20 11:43:58.181893921 +0000 UTC m=+2962.403254839" watchObservedRunningTime="2026-03-20 11:43:58.189761093 +0000 UTC m=+2962.411121991" Mar 20 11:44:00 crc kubenswrapper[4860]: I0320 11:44:00.171711 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566784-k8pc4"] Mar 20 11:44:00 crc kubenswrapper[4860]: I0320 11:44:00.173057 4860 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566784-k8pc4" Mar 20 11:44:00 crc kubenswrapper[4860]: I0320 11:44:00.176272 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-82p2f" Mar 20 11:44:00 crc kubenswrapper[4860]: I0320 11:44:00.176749 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:44:00 crc kubenswrapper[4860]: I0320 11:44:00.180685 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:44:00 crc kubenswrapper[4860]: I0320 11:44:00.197262 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566784-k8pc4"] Mar 20 11:44:00 crc kubenswrapper[4860]: I0320 11:44:00.217441 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpdx4\" (UniqueName: \"kubernetes.io/projected/1d3485ef-9918-4aa6-80d1-c1c295d46ebe-kube-api-access-dpdx4\") pod \"auto-csr-approver-29566784-k8pc4\" (UID: \"1d3485ef-9918-4aa6-80d1-c1c295d46ebe\") " pod="openshift-infra/auto-csr-approver-29566784-k8pc4" Mar 20 11:44:00 crc kubenswrapper[4860]: I0320 11:44:00.319103 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpdx4\" (UniqueName: \"kubernetes.io/projected/1d3485ef-9918-4aa6-80d1-c1c295d46ebe-kube-api-access-dpdx4\") pod \"auto-csr-approver-29566784-k8pc4\" (UID: \"1d3485ef-9918-4aa6-80d1-c1c295d46ebe\") " pod="openshift-infra/auto-csr-approver-29566784-k8pc4" Mar 20 11:44:00 crc kubenswrapper[4860]: I0320 11:44:00.348288 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpdx4\" (UniqueName: \"kubernetes.io/projected/1d3485ef-9918-4aa6-80d1-c1c295d46ebe-kube-api-access-dpdx4\") pod \"auto-csr-approver-29566784-k8pc4\" (UID: 
\"1d3485ef-9918-4aa6-80d1-c1c295d46ebe\") " pod="openshift-infra/auto-csr-approver-29566784-k8pc4" Mar 20 11:44:00 crc kubenswrapper[4860]: I0320 11:44:00.490587 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566784-k8pc4" Mar 20 11:44:00 crc kubenswrapper[4860]: I0320 11:44:00.953978 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566784-k8pc4"] Mar 20 11:44:01 crc kubenswrapper[4860]: I0320 11:44:01.180187 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566784-k8pc4" event={"ID":"1d3485ef-9918-4aa6-80d1-c1c295d46ebe","Type":"ContainerStarted","Data":"ff3cf6fead59b96d5552c0911ddb64bba5a8801af8dfced69f64cf229996bd7a"} Mar 20 11:44:03 crc kubenswrapper[4860]: I0320 11:44:03.197423 4860 generic.go:334] "Generic (PLEG): container finished" podID="1d3485ef-9918-4aa6-80d1-c1c295d46ebe" containerID="7c62f1c8ef0515a28ab838f145210c3776f9c242b812e79c909a339bcd0bc452" exitCode=0 Mar 20 11:44:03 crc kubenswrapper[4860]: I0320 11:44:03.197538 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566784-k8pc4" event={"ID":"1d3485ef-9918-4aa6-80d1-c1c295d46ebe","Type":"ContainerDied","Data":"7c62f1c8ef0515a28ab838f145210c3776f9c242b812e79c909a339bcd0bc452"} Mar 20 11:44:03 crc kubenswrapper[4860]: I0320 11:44:03.877462 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-d78wv" Mar 20 11:44:03 crc kubenswrapper[4860]: I0320 11:44:03.877545 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-d78wv" Mar 20 11:44:03 crc kubenswrapper[4860]: I0320 11:44:03.931970 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-d78wv" Mar 20 11:44:04 crc kubenswrapper[4860]: I0320 11:44:04.254296 
4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-d78wv" Mar 20 11:44:04 crc kubenswrapper[4860]: I0320 11:44:04.324254 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d78wv"] Mar 20 11:44:04 crc kubenswrapper[4860]: I0320 11:44:04.496045 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566784-k8pc4" Mar 20 11:44:04 crc kubenswrapper[4860]: I0320 11:44:04.705861 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpdx4\" (UniqueName: \"kubernetes.io/projected/1d3485ef-9918-4aa6-80d1-c1c295d46ebe-kube-api-access-dpdx4\") pod \"1d3485ef-9918-4aa6-80d1-c1c295d46ebe\" (UID: \"1d3485ef-9918-4aa6-80d1-c1c295d46ebe\") " Mar 20 11:44:04 crc kubenswrapper[4860]: I0320 11:44:04.715479 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d3485ef-9918-4aa6-80d1-c1c295d46ebe-kube-api-access-dpdx4" (OuterVolumeSpecName: "kube-api-access-dpdx4") pod "1d3485ef-9918-4aa6-80d1-c1c295d46ebe" (UID: "1d3485ef-9918-4aa6-80d1-c1c295d46ebe"). InnerVolumeSpecName "kube-api-access-dpdx4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:44:04 crc kubenswrapper[4860]: I0320 11:44:04.807742 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dpdx4\" (UniqueName: \"kubernetes.io/projected/1d3485ef-9918-4aa6-80d1-c1c295d46ebe-kube-api-access-dpdx4\") on node \"crc\" DevicePath \"\"" Mar 20 11:44:05 crc kubenswrapper[4860]: I0320 11:44:05.216563 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566784-k8pc4" Mar 20 11:44:05 crc kubenswrapper[4860]: I0320 11:44:05.216535 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566784-k8pc4" event={"ID":"1d3485ef-9918-4aa6-80d1-c1c295d46ebe","Type":"ContainerDied","Data":"ff3cf6fead59b96d5552c0911ddb64bba5a8801af8dfced69f64cf229996bd7a"} Mar 20 11:44:05 crc kubenswrapper[4860]: I0320 11:44:05.216630 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff3cf6fead59b96d5552c0911ddb64bba5a8801af8dfced69f64cf229996bd7a" Mar 20 11:44:05 crc kubenswrapper[4860]: I0320 11:44:05.413746 4860 scope.go:117] "RemoveContainer" containerID="2c6cd87f8a77136a1839b5f761bdf9d0b06d37fb07e23d5035e6ffed2dcf296d" Mar 20 11:44:05 crc kubenswrapper[4860]: E0320 11:44:05.413985 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:44:05 crc kubenswrapper[4860]: I0320 11:44:05.597651 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566778-dqv4l"] Mar 20 11:44:05 crc kubenswrapper[4860]: I0320 11:44:05.604687 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566778-dqv4l"] Mar 20 11:44:06 crc kubenswrapper[4860]: I0320 11:44:06.223653 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-d78wv" podUID="57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a" containerName="registry-server" containerID="cri-o://d1282cf4606bd21a07325c48667aa660023db3bb611e39a453b19febefbd717c" 
gracePeriod=2 Mar 20 11:44:06 crc kubenswrapper[4860]: I0320 11:44:06.635357 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d78wv" Mar 20 11:44:06 crc kubenswrapper[4860]: I0320 11:44:06.737705 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxl7f\" (UniqueName: \"kubernetes.io/projected/57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a-kube-api-access-gxl7f\") pod \"57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a\" (UID: \"57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a\") " Mar 20 11:44:06 crc kubenswrapper[4860]: I0320 11:44:06.737842 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a-catalog-content\") pod \"57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a\" (UID: \"57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a\") " Mar 20 11:44:06 crc kubenswrapper[4860]: I0320 11:44:06.738006 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a-utilities\") pod \"57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a\" (UID: \"57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a\") " Mar 20 11:44:06 crc kubenswrapper[4860]: I0320 11:44:06.739040 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a-utilities" (OuterVolumeSpecName: "utilities") pod "57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a" (UID: "57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:44:06 crc kubenswrapper[4860]: I0320 11:44:06.744814 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a-kube-api-access-gxl7f" (OuterVolumeSpecName: "kube-api-access-gxl7f") pod "57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a" (UID: "57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a"). InnerVolumeSpecName "kube-api-access-gxl7f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:44:06 crc kubenswrapper[4860]: I0320 11:44:06.831857 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a" (UID: "57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:44:06 crc kubenswrapper[4860]: I0320 11:44:06.839710 4860 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:44:06 crc kubenswrapper[4860]: I0320 11:44:06.839765 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxl7f\" (UniqueName: \"kubernetes.io/projected/57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a-kube-api-access-gxl7f\") on node \"crc\" DevicePath \"\"" Mar 20 11:44:06 crc kubenswrapper[4860]: I0320 11:44:06.839776 4860 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:44:07 crc kubenswrapper[4860]: I0320 11:44:07.235409 4860 generic.go:334] "Generic (PLEG): container finished" podID="57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a" 
containerID="d1282cf4606bd21a07325c48667aa660023db3bb611e39a453b19febefbd717c" exitCode=0 Mar 20 11:44:07 crc kubenswrapper[4860]: I0320 11:44:07.235471 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d78wv" event={"ID":"57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a","Type":"ContainerDied","Data":"d1282cf4606bd21a07325c48667aa660023db3bb611e39a453b19febefbd717c"} Mar 20 11:44:07 crc kubenswrapper[4860]: I0320 11:44:07.235513 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d78wv" event={"ID":"57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a","Type":"ContainerDied","Data":"8c98d7e4ab87dca42cddfef62a946a203e0380f2d3bb440c1fe269c8f7a89579"} Mar 20 11:44:07 crc kubenswrapper[4860]: I0320 11:44:07.235542 4860 scope.go:117] "RemoveContainer" containerID="d1282cf4606bd21a07325c48667aa660023db3bb611e39a453b19febefbd717c" Mar 20 11:44:07 crc kubenswrapper[4860]: I0320 11:44:07.235553 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d78wv" Mar 20 11:44:07 crc kubenswrapper[4860]: I0320 11:44:07.263802 4860 scope.go:117] "RemoveContainer" containerID="0c4ff035ec5a56ad01f329fb002f9a6ff79489a6fd1e4b8b2a20d7a28d179176" Mar 20 11:44:07 crc kubenswrapper[4860]: I0320 11:44:07.274499 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d78wv"] Mar 20 11:44:07 crc kubenswrapper[4860]: I0320 11:44:07.282690 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-d78wv"] Mar 20 11:44:07 crc kubenswrapper[4860]: I0320 11:44:07.302982 4860 scope.go:117] "RemoveContainer" containerID="475e3880bff1e3fbf576238a9c0745442d7100eea10561e119a99ffa1fd6e182" Mar 20 11:44:07 crc kubenswrapper[4860]: I0320 11:44:07.323393 4860 scope.go:117] "RemoveContainer" containerID="d1282cf4606bd21a07325c48667aa660023db3bb611e39a453b19febefbd717c" Mar 20 11:44:07 crc kubenswrapper[4860]: E0320 11:44:07.325153 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1282cf4606bd21a07325c48667aa660023db3bb611e39a453b19febefbd717c\": container with ID starting with d1282cf4606bd21a07325c48667aa660023db3bb611e39a453b19febefbd717c not found: ID does not exist" containerID="d1282cf4606bd21a07325c48667aa660023db3bb611e39a453b19febefbd717c" Mar 20 11:44:07 crc kubenswrapper[4860]: I0320 11:44:07.325195 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1282cf4606bd21a07325c48667aa660023db3bb611e39a453b19febefbd717c"} err="failed to get container status \"d1282cf4606bd21a07325c48667aa660023db3bb611e39a453b19febefbd717c\": rpc error: code = NotFound desc = could not find container \"d1282cf4606bd21a07325c48667aa660023db3bb611e39a453b19febefbd717c\": container with ID starting with d1282cf4606bd21a07325c48667aa660023db3bb611e39a453b19febefbd717c not found: 
ID does not exist" Mar 20 11:44:07 crc kubenswrapper[4860]: I0320 11:44:07.325235 4860 scope.go:117] "RemoveContainer" containerID="0c4ff035ec5a56ad01f329fb002f9a6ff79489a6fd1e4b8b2a20d7a28d179176" Mar 20 11:44:07 crc kubenswrapper[4860]: E0320 11:44:07.325729 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c4ff035ec5a56ad01f329fb002f9a6ff79489a6fd1e4b8b2a20d7a28d179176\": container with ID starting with 0c4ff035ec5a56ad01f329fb002f9a6ff79489a6fd1e4b8b2a20d7a28d179176 not found: ID does not exist" containerID="0c4ff035ec5a56ad01f329fb002f9a6ff79489a6fd1e4b8b2a20d7a28d179176" Mar 20 11:44:07 crc kubenswrapper[4860]: I0320 11:44:07.325760 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c4ff035ec5a56ad01f329fb002f9a6ff79489a6fd1e4b8b2a20d7a28d179176"} err="failed to get container status \"0c4ff035ec5a56ad01f329fb002f9a6ff79489a6fd1e4b8b2a20d7a28d179176\": rpc error: code = NotFound desc = could not find container \"0c4ff035ec5a56ad01f329fb002f9a6ff79489a6fd1e4b8b2a20d7a28d179176\": container with ID starting with 0c4ff035ec5a56ad01f329fb002f9a6ff79489a6fd1e4b8b2a20d7a28d179176 not found: ID does not exist" Mar 20 11:44:07 crc kubenswrapper[4860]: I0320 11:44:07.325778 4860 scope.go:117] "RemoveContainer" containerID="475e3880bff1e3fbf576238a9c0745442d7100eea10561e119a99ffa1fd6e182" Mar 20 11:44:07 crc kubenswrapper[4860]: E0320 11:44:07.326116 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"475e3880bff1e3fbf576238a9c0745442d7100eea10561e119a99ffa1fd6e182\": container with ID starting with 475e3880bff1e3fbf576238a9c0745442d7100eea10561e119a99ffa1fd6e182 not found: ID does not exist" containerID="475e3880bff1e3fbf576238a9c0745442d7100eea10561e119a99ffa1fd6e182" Mar 20 11:44:07 crc kubenswrapper[4860]: I0320 11:44:07.326139 4860 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"475e3880bff1e3fbf576238a9c0745442d7100eea10561e119a99ffa1fd6e182"} err="failed to get container status \"475e3880bff1e3fbf576238a9c0745442d7100eea10561e119a99ffa1fd6e182\": rpc error: code = NotFound desc = could not find container \"475e3880bff1e3fbf576238a9c0745442d7100eea10561e119a99ffa1fd6e182\": container with ID starting with 475e3880bff1e3fbf576238a9c0745442d7100eea10561e119a99ffa1fd6e182 not found: ID does not exist" Mar 20 11:44:07 crc kubenswrapper[4860]: I0320 11:44:07.428129 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04737161-e8a4-4231-8b96-1a617b9561a7" path="/var/lib/kubelet/pods/04737161-e8a4-4231-8b96-1a617b9561a7/volumes" Mar 20 11:44:07 crc kubenswrapper[4860]: I0320 11:44:07.430201 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a" path="/var/lib/kubelet/pods/57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a/volumes" Mar 20 11:44:14 crc kubenswrapper[4860]: I0320 11:44:14.051383 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-597xm"] Mar 20 11:44:14 crc kubenswrapper[4860]: E0320 11:44:14.052815 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a" containerName="extract-utilities" Mar 20 11:44:14 crc kubenswrapper[4860]: I0320 11:44:14.052833 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a" containerName="extract-utilities" Mar 20 11:44:14 crc kubenswrapper[4860]: E0320 11:44:14.052866 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a" containerName="extract-content" Mar 20 11:44:14 crc kubenswrapper[4860]: I0320 11:44:14.052872 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a" containerName="extract-content" Mar 20 11:44:14 crc 
kubenswrapper[4860]: E0320 11:44:14.052886 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d3485ef-9918-4aa6-80d1-c1c295d46ebe" containerName="oc" Mar 20 11:44:14 crc kubenswrapper[4860]: I0320 11:44:14.052897 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d3485ef-9918-4aa6-80d1-c1c295d46ebe" containerName="oc" Mar 20 11:44:14 crc kubenswrapper[4860]: E0320 11:44:14.052914 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a" containerName="registry-server" Mar 20 11:44:14 crc kubenswrapper[4860]: I0320 11:44:14.052920 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a" containerName="registry-server" Mar 20 11:44:14 crc kubenswrapper[4860]: I0320 11:44:14.053082 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="57351cc0-fbb5-4fc5-a8c1-66b4d1cd702a" containerName="registry-server" Mar 20 11:44:14 crc kubenswrapper[4860]: I0320 11:44:14.053098 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d3485ef-9918-4aa6-80d1-c1c295d46ebe" containerName="oc" Mar 20 11:44:14 crc kubenswrapper[4860]: I0320 11:44:14.054528 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-597xm" Mar 20 11:44:14 crc kubenswrapper[4860]: I0320 11:44:14.057350 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdad63c3-dc15-41fd-acbc-6451e3dfea6b-utilities\") pod \"certified-operators-597xm\" (UID: \"bdad63c3-dc15-41fd-acbc-6451e3dfea6b\") " pod="openshift-marketplace/certified-operators-597xm" Mar 20 11:44:14 crc kubenswrapper[4860]: I0320 11:44:14.057412 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdad63c3-dc15-41fd-acbc-6451e3dfea6b-catalog-content\") pod \"certified-operators-597xm\" (UID: \"bdad63c3-dc15-41fd-acbc-6451e3dfea6b\") " pod="openshift-marketplace/certified-operators-597xm" Mar 20 11:44:14 crc kubenswrapper[4860]: I0320 11:44:14.057435 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbl7l\" (UniqueName: \"kubernetes.io/projected/bdad63c3-dc15-41fd-acbc-6451e3dfea6b-kube-api-access-tbl7l\") pod \"certified-operators-597xm\" (UID: \"bdad63c3-dc15-41fd-acbc-6451e3dfea6b\") " pod="openshift-marketplace/certified-operators-597xm" Mar 20 11:44:14 crc kubenswrapper[4860]: I0320 11:44:14.065901 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-597xm"] Mar 20 11:44:14 crc kubenswrapper[4860]: I0320 11:44:14.158769 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdad63c3-dc15-41fd-acbc-6451e3dfea6b-catalog-content\") pod \"certified-operators-597xm\" (UID: \"bdad63c3-dc15-41fd-acbc-6451e3dfea6b\") " pod="openshift-marketplace/certified-operators-597xm" Mar 20 11:44:14 crc kubenswrapper[4860]: I0320 11:44:14.158836 4860 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-tbl7l\" (UniqueName: \"kubernetes.io/projected/bdad63c3-dc15-41fd-acbc-6451e3dfea6b-kube-api-access-tbl7l\") pod \"certified-operators-597xm\" (UID: \"bdad63c3-dc15-41fd-acbc-6451e3dfea6b\") " pod="openshift-marketplace/certified-operators-597xm" Mar 20 11:44:14 crc kubenswrapper[4860]: I0320 11:44:14.158923 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdad63c3-dc15-41fd-acbc-6451e3dfea6b-utilities\") pod \"certified-operators-597xm\" (UID: \"bdad63c3-dc15-41fd-acbc-6451e3dfea6b\") " pod="openshift-marketplace/certified-operators-597xm" Mar 20 11:44:14 crc kubenswrapper[4860]: I0320 11:44:14.159583 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdad63c3-dc15-41fd-acbc-6451e3dfea6b-utilities\") pod \"certified-operators-597xm\" (UID: \"bdad63c3-dc15-41fd-acbc-6451e3dfea6b\") " pod="openshift-marketplace/certified-operators-597xm" Mar 20 11:44:14 crc kubenswrapper[4860]: I0320 11:44:14.159687 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdad63c3-dc15-41fd-acbc-6451e3dfea6b-catalog-content\") pod \"certified-operators-597xm\" (UID: \"bdad63c3-dc15-41fd-acbc-6451e3dfea6b\") " pod="openshift-marketplace/certified-operators-597xm" Mar 20 11:44:14 crc kubenswrapper[4860]: I0320 11:44:14.183610 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbl7l\" (UniqueName: \"kubernetes.io/projected/bdad63c3-dc15-41fd-acbc-6451e3dfea6b-kube-api-access-tbl7l\") pod \"certified-operators-597xm\" (UID: \"bdad63c3-dc15-41fd-acbc-6451e3dfea6b\") " pod="openshift-marketplace/certified-operators-597xm" Mar 20 11:44:14 crc kubenswrapper[4860]: I0320 11:44:14.377577 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-597xm" Mar 20 11:44:14 crc kubenswrapper[4860]: I0320 11:44:14.675968 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-597xm"] Mar 20 11:44:15 crc kubenswrapper[4860]: I0320 11:44:15.310390 4860 generic.go:334] "Generic (PLEG): container finished" podID="bdad63c3-dc15-41fd-acbc-6451e3dfea6b" containerID="13b6cbc006b6167fa239b630f0f8692af88a74eeaf7e0b3a96f14e82af19a24d" exitCode=0 Mar 20 11:44:15 crc kubenswrapper[4860]: I0320 11:44:15.310456 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-597xm" event={"ID":"bdad63c3-dc15-41fd-acbc-6451e3dfea6b","Type":"ContainerDied","Data":"13b6cbc006b6167fa239b630f0f8692af88a74eeaf7e0b3a96f14e82af19a24d"} Mar 20 11:44:15 crc kubenswrapper[4860]: I0320 11:44:15.310861 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-597xm" event={"ID":"bdad63c3-dc15-41fd-acbc-6451e3dfea6b","Type":"ContainerStarted","Data":"9386bd05b36fdd8edc89df0d5cc27cc7e138833813ff37cf522c6edb5dcf33b4"} Mar 20 11:44:16 crc kubenswrapper[4860]: I0320 11:44:16.320551 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-597xm" event={"ID":"bdad63c3-dc15-41fd-acbc-6451e3dfea6b","Type":"ContainerStarted","Data":"ef241d624d312d6344fcff1c6d1c05806f8145d6ba265be85dc92f64afaeb60f"} Mar 20 11:44:17 crc kubenswrapper[4860]: I0320 11:44:17.336690 4860 generic.go:334] "Generic (PLEG): container finished" podID="bdad63c3-dc15-41fd-acbc-6451e3dfea6b" containerID="ef241d624d312d6344fcff1c6d1c05806f8145d6ba265be85dc92f64afaeb60f" exitCode=0 Mar 20 11:44:17 crc kubenswrapper[4860]: I0320 11:44:17.336746 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-597xm" 
event={"ID":"bdad63c3-dc15-41fd-acbc-6451e3dfea6b","Type":"ContainerDied","Data":"ef241d624d312d6344fcff1c6d1c05806f8145d6ba265be85dc92f64afaeb60f"} Mar 20 11:44:18 crc kubenswrapper[4860]: I0320 11:44:18.348751 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-597xm" event={"ID":"bdad63c3-dc15-41fd-acbc-6451e3dfea6b","Type":"ContainerStarted","Data":"46c422de9458da7fb851166206fa809911d89a7a675245c339fc28c34a21afe0"} Mar 20 11:44:18 crc kubenswrapper[4860]: I0320 11:44:18.369925 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-597xm" podStartSLOduration=1.8734772849999999 podStartE2EDuration="4.369896927s" podCreationTimestamp="2026-03-20 11:44:14 +0000 UTC" firstStartedPulling="2026-03-20 11:44:15.312115844 +0000 UTC m=+2979.533476742" lastFinishedPulling="2026-03-20 11:44:17.808535486 +0000 UTC m=+2982.029896384" observedRunningTime="2026-03-20 11:44:18.366707 +0000 UTC m=+2982.588067908" watchObservedRunningTime="2026-03-20 11:44:18.369896927 +0000 UTC m=+2982.591257825" Mar 20 11:44:20 crc kubenswrapper[4860]: I0320 11:44:20.413677 4860 scope.go:117] "RemoveContainer" containerID="2c6cd87f8a77136a1839b5f761bdf9d0b06d37fb07e23d5035e6ffed2dcf296d" Mar 20 11:44:20 crc kubenswrapper[4860]: E0320 11:44:20.414530 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:44:24 crc kubenswrapper[4860]: I0320 11:44:24.378574 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-597xm" Mar 20 11:44:24 crc 
kubenswrapper[4860]: I0320 11:44:24.379278 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-597xm" Mar 20 11:44:24 crc kubenswrapper[4860]: I0320 11:44:24.421797 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-597xm" Mar 20 11:44:24 crc kubenswrapper[4860]: I0320 11:44:24.471583 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-597xm" Mar 20 11:44:24 crc kubenswrapper[4860]: I0320 11:44:24.753386 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-597xm"] Mar 20 11:44:26 crc kubenswrapper[4860]: I0320 11:44:26.407844 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-597xm" podUID="bdad63c3-dc15-41fd-acbc-6451e3dfea6b" containerName="registry-server" containerID="cri-o://46c422de9458da7fb851166206fa809911d89a7a675245c339fc28c34a21afe0" gracePeriod=2 Mar 20 11:44:26 crc kubenswrapper[4860]: I0320 11:44:26.814861 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-597xm" Mar 20 11:44:26 crc kubenswrapper[4860]: I0320 11:44:26.966172 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tbl7l\" (UniqueName: \"kubernetes.io/projected/bdad63c3-dc15-41fd-acbc-6451e3dfea6b-kube-api-access-tbl7l\") pod \"bdad63c3-dc15-41fd-acbc-6451e3dfea6b\" (UID: \"bdad63c3-dc15-41fd-acbc-6451e3dfea6b\") " Mar 20 11:44:26 crc kubenswrapper[4860]: I0320 11:44:26.966837 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdad63c3-dc15-41fd-acbc-6451e3dfea6b-catalog-content\") pod \"bdad63c3-dc15-41fd-acbc-6451e3dfea6b\" (UID: \"bdad63c3-dc15-41fd-acbc-6451e3dfea6b\") " Mar 20 11:44:26 crc kubenswrapper[4860]: I0320 11:44:26.966867 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdad63c3-dc15-41fd-acbc-6451e3dfea6b-utilities\") pod \"bdad63c3-dc15-41fd-acbc-6451e3dfea6b\" (UID: \"bdad63c3-dc15-41fd-acbc-6451e3dfea6b\") " Mar 20 11:44:26 crc kubenswrapper[4860]: I0320 11:44:26.968717 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdad63c3-dc15-41fd-acbc-6451e3dfea6b-utilities" (OuterVolumeSpecName: "utilities") pod "bdad63c3-dc15-41fd-acbc-6451e3dfea6b" (UID: "bdad63c3-dc15-41fd-acbc-6451e3dfea6b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:44:26 crc kubenswrapper[4860]: I0320 11:44:26.987010 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdad63c3-dc15-41fd-acbc-6451e3dfea6b-kube-api-access-tbl7l" (OuterVolumeSpecName: "kube-api-access-tbl7l") pod "bdad63c3-dc15-41fd-acbc-6451e3dfea6b" (UID: "bdad63c3-dc15-41fd-acbc-6451e3dfea6b"). InnerVolumeSpecName "kube-api-access-tbl7l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:44:27 crc kubenswrapper[4860]: I0320 11:44:27.033554 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdad63c3-dc15-41fd-acbc-6451e3dfea6b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bdad63c3-dc15-41fd-acbc-6451e3dfea6b" (UID: "bdad63c3-dc15-41fd-acbc-6451e3dfea6b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:44:27 crc kubenswrapper[4860]: I0320 11:44:27.069304 4860 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdad63c3-dc15-41fd-acbc-6451e3dfea6b-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:44:27 crc kubenswrapper[4860]: I0320 11:44:27.069367 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tbl7l\" (UniqueName: \"kubernetes.io/projected/bdad63c3-dc15-41fd-acbc-6451e3dfea6b-kube-api-access-tbl7l\") on node \"crc\" DevicePath \"\"" Mar 20 11:44:27 crc kubenswrapper[4860]: I0320 11:44:27.069388 4860 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdad63c3-dc15-41fd-acbc-6451e3dfea6b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:44:27 crc kubenswrapper[4860]: I0320 11:44:27.419765 4860 generic.go:334] "Generic (PLEG): container finished" podID="bdad63c3-dc15-41fd-acbc-6451e3dfea6b" containerID="46c422de9458da7fb851166206fa809911d89a7a675245c339fc28c34a21afe0" exitCode=0 Mar 20 11:44:27 crc kubenswrapper[4860]: I0320 11:44:27.419902 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-597xm" Mar 20 11:44:27 crc kubenswrapper[4860]: I0320 11:44:27.432823 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-597xm" event={"ID":"bdad63c3-dc15-41fd-acbc-6451e3dfea6b","Type":"ContainerDied","Data":"46c422de9458da7fb851166206fa809911d89a7a675245c339fc28c34a21afe0"} Mar 20 11:44:27 crc kubenswrapper[4860]: I0320 11:44:27.433654 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-597xm" event={"ID":"bdad63c3-dc15-41fd-acbc-6451e3dfea6b","Type":"ContainerDied","Data":"9386bd05b36fdd8edc89df0d5cc27cc7e138833813ff37cf522c6edb5dcf33b4"} Mar 20 11:44:27 crc kubenswrapper[4860]: I0320 11:44:27.433757 4860 scope.go:117] "RemoveContainer" containerID="46c422de9458da7fb851166206fa809911d89a7a675245c339fc28c34a21afe0" Mar 20 11:44:27 crc kubenswrapper[4860]: I0320 11:44:27.459127 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-597xm"] Mar 20 11:44:27 crc kubenswrapper[4860]: I0320 11:44:27.462306 4860 scope.go:117] "RemoveContainer" containerID="ef241d624d312d6344fcff1c6d1c05806f8145d6ba265be85dc92f64afaeb60f" Mar 20 11:44:27 crc kubenswrapper[4860]: I0320 11:44:27.466261 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-597xm"] Mar 20 11:44:27 crc kubenswrapper[4860]: I0320 11:44:27.484240 4860 scope.go:117] "RemoveContainer" containerID="13b6cbc006b6167fa239b630f0f8692af88a74eeaf7e0b3a96f14e82af19a24d" Mar 20 11:44:27 crc kubenswrapper[4860]: I0320 11:44:27.507370 4860 scope.go:117] "RemoveContainer" containerID="46c422de9458da7fb851166206fa809911d89a7a675245c339fc28c34a21afe0" Mar 20 11:44:27 crc kubenswrapper[4860]: E0320 11:44:27.508039 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"46c422de9458da7fb851166206fa809911d89a7a675245c339fc28c34a21afe0\": container with ID starting with 46c422de9458da7fb851166206fa809911d89a7a675245c339fc28c34a21afe0 not found: ID does not exist" containerID="46c422de9458da7fb851166206fa809911d89a7a675245c339fc28c34a21afe0" Mar 20 11:44:27 crc kubenswrapper[4860]: I0320 11:44:27.508117 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46c422de9458da7fb851166206fa809911d89a7a675245c339fc28c34a21afe0"} err="failed to get container status \"46c422de9458da7fb851166206fa809911d89a7a675245c339fc28c34a21afe0\": rpc error: code = NotFound desc = could not find container \"46c422de9458da7fb851166206fa809911d89a7a675245c339fc28c34a21afe0\": container with ID starting with 46c422de9458da7fb851166206fa809911d89a7a675245c339fc28c34a21afe0 not found: ID does not exist" Mar 20 11:44:27 crc kubenswrapper[4860]: I0320 11:44:27.508150 4860 scope.go:117] "RemoveContainer" containerID="ef241d624d312d6344fcff1c6d1c05806f8145d6ba265be85dc92f64afaeb60f" Mar 20 11:44:27 crc kubenswrapper[4860]: E0320 11:44:27.508569 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef241d624d312d6344fcff1c6d1c05806f8145d6ba265be85dc92f64afaeb60f\": container with ID starting with ef241d624d312d6344fcff1c6d1c05806f8145d6ba265be85dc92f64afaeb60f not found: ID does not exist" containerID="ef241d624d312d6344fcff1c6d1c05806f8145d6ba265be85dc92f64afaeb60f" Mar 20 11:44:27 crc kubenswrapper[4860]: I0320 11:44:27.508613 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef241d624d312d6344fcff1c6d1c05806f8145d6ba265be85dc92f64afaeb60f"} err="failed to get container status \"ef241d624d312d6344fcff1c6d1c05806f8145d6ba265be85dc92f64afaeb60f\": rpc error: code = NotFound desc = could not find container \"ef241d624d312d6344fcff1c6d1c05806f8145d6ba265be85dc92f64afaeb60f\": container with ID 
starting with ef241d624d312d6344fcff1c6d1c05806f8145d6ba265be85dc92f64afaeb60f not found: ID does not exist" Mar 20 11:44:27 crc kubenswrapper[4860]: I0320 11:44:27.508638 4860 scope.go:117] "RemoveContainer" containerID="13b6cbc006b6167fa239b630f0f8692af88a74eeaf7e0b3a96f14e82af19a24d" Mar 20 11:44:27 crc kubenswrapper[4860]: E0320 11:44:27.509176 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13b6cbc006b6167fa239b630f0f8692af88a74eeaf7e0b3a96f14e82af19a24d\": container with ID starting with 13b6cbc006b6167fa239b630f0f8692af88a74eeaf7e0b3a96f14e82af19a24d not found: ID does not exist" containerID="13b6cbc006b6167fa239b630f0f8692af88a74eeaf7e0b3a96f14e82af19a24d" Mar 20 11:44:27 crc kubenswrapper[4860]: I0320 11:44:27.509255 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13b6cbc006b6167fa239b630f0f8692af88a74eeaf7e0b3a96f14e82af19a24d"} err="failed to get container status \"13b6cbc006b6167fa239b630f0f8692af88a74eeaf7e0b3a96f14e82af19a24d\": rpc error: code = NotFound desc = could not find container \"13b6cbc006b6167fa239b630f0f8692af88a74eeaf7e0b3a96f14e82af19a24d\": container with ID starting with 13b6cbc006b6167fa239b630f0f8692af88a74eeaf7e0b3a96f14e82af19a24d not found: ID does not exist" Mar 20 11:44:29 crc kubenswrapper[4860]: I0320 11:44:29.423011 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdad63c3-dc15-41fd-acbc-6451e3dfea6b" path="/var/lib/kubelet/pods/bdad63c3-dc15-41fd-acbc-6451e3dfea6b/volumes" Mar 20 11:44:34 crc kubenswrapper[4860]: I0320 11:44:34.413136 4860 scope.go:117] "RemoveContainer" containerID="2c6cd87f8a77136a1839b5f761bdf9d0b06d37fb07e23d5035e6ffed2dcf296d" Mar 20 11:44:34 crc kubenswrapper[4860]: E0320 11:44:34.414319 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:44:46 crc kubenswrapper[4860]: I0320 11:44:46.414956 4860 scope.go:117] "RemoveContainer" containerID="2c6cd87f8a77136a1839b5f761bdf9d0b06d37fb07e23d5035e6ffed2dcf296d" Mar 20 11:44:46 crc kubenswrapper[4860]: E0320 11:44:46.416193 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:44:49 crc kubenswrapper[4860]: I0320 11:44:49.327540 4860 scope.go:117] "RemoveContainer" containerID="2bf13e1cbb626df84de24a90ddc00424f7dbac653c634127ff56b49722ddadfd" Mar 20 11:44:57 crc kubenswrapper[4860]: I0320 11:44:57.413668 4860 scope.go:117] "RemoveContainer" containerID="2c6cd87f8a77136a1839b5f761bdf9d0b06d37fb07e23d5035e6ffed2dcf296d" Mar 20 11:44:57 crc kubenswrapper[4860]: E0320 11:44:57.414404 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:45:00 crc kubenswrapper[4860]: I0320 11:45:00.157108 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566785-sxf4z"] Mar 20 
11:45:00 crc kubenswrapper[4860]: E0320 11:45:00.158101 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdad63c3-dc15-41fd-acbc-6451e3dfea6b" containerName="extract-utilities" Mar 20 11:45:00 crc kubenswrapper[4860]: I0320 11:45:00.158122 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdad63c3-dc15-41fd-acbc-6451e3dfea6b" containerName="extract-utilities" Mar 20 11:45:00 crc kubenswrapper[4860]: E0320 11:45:00.158155 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdad63c3-dc15-41fd-acbc-6451e3dfea6b" containerName="registry-server" Mar 20 11:45:00 crc kubenswrapper[4860]: I0320 11:45:00.158164 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdad63c3-dc15-41fd-acbc-6451e3dfea6b" containerName="registry-server" Mar 20 11:45:00 crc kubenswrapper[4860]: E0320 11:45:00.158178 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdad63c3-dc15-41fd-acbc-6451e3dfea6b" containerName="extract-content" Mar 20 11:45:00 crc kubenswrapper[4860]: I0320 11:45:00.158187 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdad63c3-dc15-41fd-acbc-6451e3dfea6b" containerName="extract-content" Mar 20 11:45:00 crc kubenswrapper[4860]: I0320 11:45:00.158380 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdad63c3-dc15-41fd-acbc-6451e3dfea6b" containerName="registry-server" Mar 20 11:45:00 crc kubenswrapper[4860]: I0320 11:45:00.160110 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-sxf4z" Mar 20 11:45:00 crc kubenswrapper[4860]: I0320 11:45:00.162954 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 11:45:00 crc kubenswrapper[4860]: I0320 11:45:00.163787 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 11:45:00 crc kubenswrapper[4860]: I0320 11:45:00.170381 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566785-sxf4z"] Mar 20 11:45:00 crc kubenswrapper[4860]: I0320 11:45:00.262826 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nktf4\" (UniqueName: \"kubernetes.io/projected/7cd4763a-4d0b-4052-a286-b5bfa32a2712-kube-api-access-nktf4\") pod \"collect-profiles-29566785-sxf4z\" (UID: \"7cd4763a-4d0b-4052-a286-b5bfa32a2712\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-sxf4z" Mar 20 11:45:00 crc kubenswrapper[4860]: I0320 11:45:00.263392 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7cd4763a-4d0b-4052-a286-b5bfa32a2712-secret-volume\") pod \"collect-profiles-29566785-sxf4z\" (UID: \"7cd4763a-4d0b-4052-a286-b5bfa32a2712\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-sxf4z" Mar 20 11:45:00 crc kubenswrapper[4860]: I0320 11:45:00.263554 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7cd4763a-4d0b-4052-a286-b5bfa32a2712-config-volume\") pod \"collect-profiles-29566785-sxf4z\" (UID: \"7cd4763a-4d0b-4052-a286-b5bfa32a2712\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-sxf4z" Mar 20 11:45:00 crc kubenswrapper[4860]: I0320 11:45:00.365014 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7cd4763a-4d0b-4052-a286-b5bfa32a2712-secret-volume\") pod \"collect-profiles-29566785-sxf4z\" (UID: \"7cd4763a-4d0b-4052-a286-b5bfa32a2712\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-sxf4z" Mar 20 11:45:00 crc kubenswrapper[4860]: I0320 11:45:00.365079 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7cd4763a-4d0b-4052-a286-b5bfa32a2712-config-volume\") pod \"collect-profiles-29566785-sxf4z\" (UID: \"7cd4763a-4d0b-4052-a286-b5bfa32a2712\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-sxf4z" Mar 20 11:45:00 crc kubenswrapper[4860]: I0320 11:45:00.365117 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nktf4\" (UniqueName: \"kubernetes.io/projected/7cd4763a-4d0b-4052-a286-b5bfa32a2712-kube-api-access-nktf4\") pod \"collect-profiles-29566785-sxf4z\" (UID: \"7cd4763a-4d0b-4052-a286-b5bfa32a2712\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-sxf4z" Mar 20 11:45:00 crc kubenswrapper[4860]: I0320 11:45:00.366367 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7cd4763a-4d0b-4052-a286-b5bfa32a2712-config-volume\") pod \"collect-profiles-29566785-sxf4z\" (UID: \"7cd4763a-4d0b-4052-a286-b5bfa32a2712\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-sxf4z" Mar 20 11:45:00 crc kubenswrapper[4860]: I0320 11:45:00.373463 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/7cd4763a-4d0b-4052-a286-b5bfa32a2712-secret-volume\") pod \"collect-profiles-29566785-sxf4z\" (UID: \"7cd4763a-4d0b-4052-a286-b5bfa32a2712\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-sxf4z" Mar 20 11:45:00 crc kubenswrapper[4860]: I0320 11:45:00.385446 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nktf4\" (UniqueName: \"kubernetes.io/projected/7cd4763a-4d0b-4052-a286-b5bfa32a2712-kube-api-access-nktf4\") pod \"collect-profiles-29566785-sxf4z\" (UID: \"7cd4763a-4d0b-4052-a286-b5bfa32a2712\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-sxf4z" Mar 20 11:45:00 crc kubenswrapper[4860]: I0320 11:45:00.494395 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-sxf4z" Mar 20 11:45:00 crc kubenswrapper[4860]: I0320 11:45:00.767763 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566785-sxf4z"] Mar 20 11:45:00 crc kubenswrapper[4860]: I0320 11:45:00.968096 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-sxf4z" event={"ID":"7cd4763a-4d0b-4052-a286-b5bfa32a2712","Type":"ContainerStarted","Data":"06e03ab518047b3abf690513eb34502d3f746664d0cdfa703b39a226ba064688"} Mar 20 11:45:00 crc kubenswrapper[4860]: I0320 11:45:00.968171 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-sxf4z" event={"ID":"7cd4763a-4d0b-4052-a286-b5bfa32a2712","Type":"ContainerStarted","Data":"f104a3386c9ae8e14c7c7d1f332bbfa683c6e689a699938e2a44536f168b9259"} Mar 20 11:45:00 crc kubenswrapper[4860]: I0320 11:45:00.993266 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-sxf4z" 
podStartSLOduration=0.993223547 podStartE2EDuration="993.223547ms" podCreationTimestamp="2026-03-20 11:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 11:45:00.985263282 +0000 UTC m=+3025.206624200" watchObservedRunningTime="2026-03-20 11:45:00.993223547 +0000 UTC m=+3025.214584445" Mar 20 11:45:01 crc kubenswrapper[4860]: I0320 11:45:01.978716 4860 generic.go:334] "Generic (PLEG): container finished" podID="7cd4763a-4d0b-4052-a286-b5bfa32a2712" containerID="06e03ab518047b3abf690513eb34502d3f746664d0cdfa703b39a226ba064688" exitCode=0 Mar 20 11:45:01 crc kubenswrapper[4860]: I0320 11:45:01.978797 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-sxf4z" event={"ID":"7cd4763a-4d0b-4052-a286-b5bfa32a2712","Type":"ContainerDied","Data":"06e03ab518047b3abf690513eb34502d3f746664d0cdfa703b39a226ba064688"} Mar 20 11:45:03 crc kubenswrapper[4860]: I0320 11:45:03.263975 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-sxf4z" Mar 20 11:45:03 crc kubenswrapper[4860]: I0320 11:45:03.413777 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7cd4763a-4d0b-4052-a286-b5bfa32a2712-config-volume\") pod \"7cd4763a-4d0b-4052-a286-b5bfa32a2712\" (UID: \"7cd4763a-4d0b-4052-a286-b5bfa32a2712\") " Mar 20 11:45:03 crc kubenswrapper[4860]: I0320 11:45:03.413921 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nktf4\" (UniqueName: \"kubernetes.io/projected/7cd4763a-4d0b-4052-a286-b5bfa32a2712-kube-api-access-nktf4\") pod \"7cd4763a-4d0b-4052-a286-b5bfa32a2712\" (UID: \"7cd4763a-4d0b-4052-a286-b5bfa32a2712\") " Mar 20 11:45:03 crc kubenswrapper[4860]: I0320 11:45:03.414019 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7cd4763a-4d0b-4052-a286-b5bfa32a2712-secret-volume\") pod \"7cd4763a-4d0b-4052-a286-b5bfa32a2712\" (UID: \"7cd4763a-4d0b-4052-a286-b5bfa32a2712\") " Mar 20 11:45:03 crc kubenswrapper[4860]: I0320 11:45:03.414380 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cd4763a-4d0b-4052-a286-b5bfa32a2712-config-volume" (OuterVolumeSpecName: "config-volume") pod "7cd4763a-4d0b-4052-a286-b5bfa32a2712" (UID: "7cd4763a-4d0b-4052-a286-b5bfa32a2712"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:45:03 crc kubenswrapper[4860]: I0320 11:45:03.421624 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cd4763a-4d0b-4052-a286-b5bfa32a2712-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7cd4763a-4d0b-4052-a286-b5bfa32a2712" (UID: "7cd4763a-4d0b-4052-a286-b5bfa32a2712"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:45:03 crc kubenswrapper[4860]: I0320 11:45:03.422806 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cd4763a-4d0b-4052-a286-b5bfa32a2712-kube-api-access-nktf4" (OuterVolumeSpecName: "kube-api-access-nktf4") pod "7cd4763a-4d0b-4052-a286-b5bfa32a2712" (UID: "7cd4763a-4d0b-4052-a286-b5bfa32a2712"). InnerVolumeSpecName "kube-api-access-nktf4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:45:03 crc kubenswrapper[4860]: I0320 11:45:03.516499 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nktf4\" (UniqueName: \"kubernetes.io/projected/7cd4763a-4d0b-4052-a286-b5bfa32a2712-kube-api-access-nktf4\") on node \"crc\" DevicePath \"\"" Mar 20 11:45:03 crc kubenswrapper[4860]: I0320 11:45:03.516548 4860 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7cd4763a-4d0b-4052-a286-b5bfa32a2712-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 11:45:03 crc kubenswrapper[4860]: I0320 11:45:03.516561 4860 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7cd4763a-4d0b-4052-a286-b5bfa32a2712-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 11:45:03 crc kubenswrapper[4860]: I0320 11:45:03.997865 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-sxf4z" event={"ID":"7cd4763a-4d0b-4052-a286-b5bfa32a2712","Type":"ContainerDied","Data":"f104a3386c9ae8e14c7c7d1f332bbfa683c6e689a699938e2a44536f168b9259"} Mar 20 11:45:03 crc kubenswrapper[4860]: I0320 11:45:03.997940 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f104a3386c9ae8e14c7c7d1f332bbfa683c6e689a699938e2a44536f168b9259" Mar 20 11:45:03 crc kubenswrapper[4860]: I0320 11:45:03.997953 4860 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-sxf4z" Mar 20 11:45:04 crc kubenswrapper[4860]: I0320 11:45:04.347896 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566740-627pq"] Mar 20 11:45:04 crc kubenswrapper[4860]: I0320 11:45:04.353042 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566740-627pq"] Mar 20 11:45:05 crc kubenswrapper[4860]: I0320 11:45:05.424759 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41a09ead-8137-4791-896c-c5a9cad7f4cf" path="/var/lib/kubelet/pods/41a09ead-8137-4791-896c-c5a9cad7f4cf/volumes" Mar 20 11:45:09 crc kubenswrapper[4860]: I0320 11:45:09.413241 4860 scope.go:117] "RemoveContainer" containerID="2c6cd87f8a77136a1839b5f761bdf9d0b06d37fb07e23d5035e6ffed2dcf296d" Mar 20 11:45:09 crc kubenswrapper[4860]: E0320 11:45:09.414264 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:45:24 crc kubenswrapper[4860]: I0320 11:45:24.414107 4860 scope.go:117] "RemoveContainer" containerID="2c6cd87f8a77136a1839b5f761bdf9d0b06d37fb07e23d5035e6ffed2dcf296d" Mar 20 11:45:24 crc kubenswrapper[4860]: E0320 11:45:24.416444 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:45:39 crc kubenswrapper[4860]: I0320 11:45:39.415014 4860 scope.go:117] "RemoveContainer" containerID="2c6cd87f8a77136a1839b5f761bdf9d0b06d37fb07e23d5035e6ffed2dcf296d" Mar 20 11:45:39 crc kubenswrapper[4860]: E0320 11:45:39.416803 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:45:49 crc kubenswrapper[4860]: I0320 11:45:49.424944 4860 scope.go:117] "RemoveContainer" containerID="728de8ccc22f402da25ca09407c17b66749c7ba40a4b7eb4c5cb707fe2325a9c" Mar 20 11:45:50 crc kubenswrapper[4860]: I0320 11:45:50.414064 4860 scope.go:117] "RemoveContainer" containerID="2c6cd87f8a77136a1839b5f761bdf9d0b06d37fb07e23d5035e6ffed2dcf296d" Mar 20 11:45:50 crc kubenswrapper[4860]: E0320 11:45:50.414816 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:46:00 crc kubenswrapper[4860]: I0320 11:46:00.153727 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566786-6kgpn"] Mar 20 11:46:00 crc kubenswrapper[4860]: E0320 11:46:00.158118 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cd4763a-4d0b-4052-a286-b5bfa32a2712" 
containerName="collect-profiles" Mar 20 11:46:00 crc kubenswrapper[4860]: I0320 11:46:00.158185 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cd4763a-4d0b-4052-a286-b5bfa32a2712" containerName="collect-profiles" Mar 20 11:46:00 crc kubenswrapper[4860]: I0320 11:46:00.158524 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cd4763a-4d0b-4052-a286-b5bfa32a2712" containerName="collect-profiles" Mar 20 11:46:00 crc kubenswrapper[4860]: I0320 11:46:00.159332 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566786-6kgpn" Mar 20 11:46:00 crc kubenswrapper[4860]: I0320 11:46:00.162263 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-82p2f" Mar 20 11:46:00 crc kubenswrapper[4860]: I0320 11:46:00.162462 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:46:00 crc kubenswrapper[4860]: I0320 11:46:00.163883 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:46:00 crc kubenswrapper[4860]: I0320 11:46:00.164292 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566786-6kgpn"] Mar 20 11:46:00 crc kubenswrapper[4860]: I0320 11:46:00.318891 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jjbv\" (UniqueName: \"kubernetes.io/projected/a69dccb3-324d-47b3-92d0-af9fc224932d-kube-api-access-6jjbv\") pod \"auto-csr-approver-29566786-6kgpn\" (UID: \"a69dccb3-324d-47b3-92d0-af9fc224932d\") " pod="openshift-infra/auto-csr-approver-29566786-6kgpn" Mar 20 11:46:00 crc kubenswrapper[4860]: I0320 11:46:00.420916 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jjbv\" (UniqueName: 
\"kubernetes.io/projected/a69dccb3-324d-47b3-92d0-af9fc224932d-kube-api-access-6jjbv\") pod \"auto-csr-approver-29566786-6kgpn\" (UID: \"a69dccb3-324d-47b3-92d0-af9fc224932d\") " pod="openshift-infra/auto-csr-approver-29566786-6kgpn" Mar 20 11:46:00 crc kubenswrapper[4860]: I0320 11:46:00.448343 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jjbv\" (UniqueName: \"kubernetes.io/projected/a69dccb3-324d-47b3-92d0-af9fc224932d-kube-api-access-6jjbv\") pod \"auto-csr-approver-29566786-6kgpn\" (UID: \"a69dccb3-324d-47b3-92d0-af9fc224932d\") " pod="openshift-infra/auto-csr-approver-29566786-6kgpn" Mar 20 11:46:00 crc kubenswrapper[4860]: I0320 11:46:00.486501 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566786-6kgpn" Mar 20 11:46:00 crc kubenswrapper[4860]: I0320 11:46:00.921293 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566786-6kgpn"] Mar 20 11:46:00 crc kubenswrapper[4860]: I0320 11:46:00.928527 4860 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 11:46:01 crc kubenswrapper[4860]: I0320 11:46:01.474681 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566786-6kgpn" event={"ID":"a69dccb3-324d-47b3-92d0-af9fc224932d","Type":"ContainerStarted","Data":"1c358413d1a4a857bdf34b8955e68600db7d6770a01e0d6e798d3d144dbdea0b"} Mar 20 11:46:02 crc kubenswrapper[4860]: I0320 11:46:02.413925 4860 scope.go:117] "RemoveContainer" containerID="2c6cd87f8a77136a1839b5f761bdf9d0b06d37fb07e23d5035e6ffed2dcf296d" Mar 20 11:46:02 crc kubenswrapper[4860]: E0320 11:46:02.414864 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:46:02 crc kubenswrapper[4860]: I0320 11:46:02.485539 4860 generic.go:334] "Generic (PLEG): container finished" podID="a69dccb3-324d-47b3-92d0-af9fc224932d" containerID="02ca8def8758e2ad1b605230bcb844ea2d285141ff0c9b3e5a91bad1e50bf67e" exitCode=0 Mar 20 11:46:02 crc kubenswrapper[4860]: I0320 11:46:02.485630 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566786-6kgpn" event={"ID":"a69dccb3-324d-47b3-92d0-af9fc224932d","Type":"ContainerDied","Data":"02ca8def8758e2ad1b605230bcb844ea2d285141ff0c9b3e5a91bad1e50bf67e"} Mar 20 11:46:03 crc kubenswrapper[4860]: I0320 11:46:03.758547 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566786-6kgpn" Mar 20 11:46:03 crc kubenswrapper[4860]: I0320 11:46:03.879707 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jjbv\" (UniqueName: \"kubernetes.io/projected/a69dccb3-324d-47b3-92d0-af9fc224932d-kube-api-access-6jjbv\") pod \"a69dccb3-324d-47b3-92d0-af9fc224932d\" (UID: \"a69dccb3-324d-47b3-92d0-af9fc224932d\") " Mar 20 11:46:03 crc kubenswrapper[4860]: I0320 11:46:03.886148 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a69dccb3-324d-47b3-92d0-af9fc224932d-kube-api-access-6jjbv" (OuterVolumeSpecName: "kube-api-access-6jjbv") pod "a69dccb3-324d-47b3-92d0-af9fc224932d" (UID: "a69dccb3-324d-47b3-92d0-af9fc224932d"). InnerVolumeSpecName "kube-api-access-6jjbv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:46:03 crc kubenswrapper[4860]: I0320 11:46:03.981521 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jjbv\" (UniqueName: \"kubernetes.io/projected/a69dccb3-324d-47b3-92d0-af9fc224932d-kube-api-access-6jjbv\") on node \"crc\" DevicePath \"\"" Mar 20 11:46:04 crc kubenswrapper[4860]: I0320 11:46:04.501249 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566786-6kgpn" event={"ID":"a69dccb3-324d-47b3-92d0-af9fc224932d","Type":"ContainerDied","Data":"1c358413d1a4a857bdf34b8955e68600db7d6770a01e0d6e798d3d144dbdea0b"} Mar 20 11:46:04 crc kubenswrapper[4860]: I0320 11:46:04.501573 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c358413d1a4a857bdf34b8955e68600db7d6770a01e0d6e798d3d144dbdea0b" Mar 20 11:46:04 crc kubenswrapper[4860]: I0320 11:46:04.501347 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566786-6kgpn" Mar 20 11:46:04 crc kubenswrapper[4860]: I0320 11:46:04.843278 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566780-dpx82"] Mar 20 11:46:04 crc kubenswrapper[4860]: I0320 11:46:04.849317 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566780-dpx82"] Mar 20 11:46:05 crc kubenswrapper[4860]: I0320 11:46:05.426007 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f75e65d-773b-4474-985a-2ca6fea0dc6a" path="/var/lib/kubelet/pods/0f75e65d-773b-4474-985a-2ca6fea0dc6a/volumes" Mar 20 11:46:13 crc kubenswrapper[4860]: I0320 11:46:13.413802 4860 scope.go:117] "RemoveContainer" containerID="2c6cd87f8a77136a1839b5f761bdf9d0b06d37fb07e23d5035e6ffed2dcf296d" Mar 20 11:46:13 crc kubenswrapper[4860]: E0320 11:46:13.415178 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:46:28 crc kubenswrapper[4860]: I0320 11:46:28.413529 4860 scope.go:117] "RemoveContainer" containerID="2c6cd87f8a77136a1839b5f761bdf9d0b06d37fb07e23d5035e6ffed2dcf296d" Mar 20 11:46:28 crc kubenswrapper[4860]: E0320 11:46:28.416673 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:46:40 crc kubenswrapper[4860]: I0320 11:46:40.413880 4860 scope.go:117] "RemoveContainer" containerID="2c6cd87f8a77136a1839b5f761bdf9d0b06d37fb07e23d5035e6ffed2dcf296d" Mar 20 11:46:40 crc kubenswrapper[4860]: E0320 11:46:40.415096 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:46:49 crc kubenswrapper[4860]: I0320 11:46:49.489121 4860 scope.go:117] "RemoveContainer" containerID="7535e8513557739754c0e4889cb6b1e5e63acd730d71679b5f7aab2ccfb3bbbd" Mar 20 11:46:55 crc kubenswrapper[4860]: I0320 11:46:55.413961 4860 scope.go:117] "RemoveContainer" 
containerID="2c6cd87f8a77136a1839b5f761bdf9d0b06d37fb07e23d5035e6ffed2dcf296d" Mar 20 11:46:55 crc kubenswrapper[4860]: E0320 11:46:55.416496 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:47:02 crc kubenswrapper[4860]: I0320 11:47:02.335085 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-972zc/must-gather-g5zkg"] Mar 20 11:47:02 crc kubenswrapper[4860]: E0320 11:47:02.336034 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a69dccb3-324d-47b3-92d0-af9fc224932d" containerName="oc" Mar 20 11:47:02 crc kubenswrapper[4860]: I0320 11:47:02.336047 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="a69dccb3-324d-47b3-92d0-af9fc224932d" containerName="oc" Mar 20 11:47:02 crc kubenswrapper[4860]: I0320 11:47:02.336193 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="a69dccb3-324d-47b3-92d0-af9fc224932d" containerName="oc" Mar 20 11:47:02 crc kubenswrapper[4860]: I0320 11:47:02.337002 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-972zc/must-gather-g5zkg" Mar 20 11:47:02 crc kubenswrapper[4860]: I0320 11:47:02.343505 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-972zc"/"kube-root-ca.crt" Mar 20 11:47:02 crc kubenswrapper[4860]: I0320 11:47:02.345005 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-972zc"/"openshift-service-ca.crt" Mar 20 11:47:02 crc kubenswrapper[4860]: I0320 11:47:02.444009 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-972zc/must-gather-g5zkg"] Mar 20 11:47:02 crc kubenswrapper[4860]: I0320 11:47:02.450206 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbg2s\" (UniqueName: \"kubernetes.io/projected/537f47a7-01d4-449a-8afc-a83a212f4bc5-kube-api-access-dbg2s\") pod \"must-gather-g5zkg\" (UID: \"537f47a7-01d4-449a-8afc-a83a212f4bc5\") " pod="openshift-must-gather-972zc/must-gather-g5zkg" Mar 20 11:47:02 crc kubenswrapper[4860]: I0320 11:47:02.450433 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/537f47a7-01d4-449a-8afc-a83a212f4bc5-must-gather-output\") pod \"must-gather-g5zkg\" (UID: \"537f47a7-01d4-449a-8afc-a83a212f4bc5\") " pod="openshift-must-gather-972zc/must-gather-g5zkg" Mar 20 11:47:02 crc kubenswrapper[4860]: I0320 11:47:02.551980 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbg2s\" (UniqueName: \"kubernetes.io/projected/537f47a7-01d4-449a-8afc-a83a212f4bc5-kube-api-access-dbg2s\") pod \"must-gather-g5zkg\" (UID: \"537f47a7-01d4-449a-8afc-a83a212f4bc5\") " pod="openshift-must-gather-972zc/must-gather-g5zkg" Mar 20 11:47:02 crc kubenswrapper[4860]: I0320 11:47:02.552105 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/537f47a7-01d4-449a-8afc-a83a212f4bc5-must-gather-output\") pod \"must-gather-g5zkg\" (UID: \"537f47a7-01d4-449a-8afc-a83a212f4bc5\") " pod="openshift-must-gather-972zc/must-gather-g5zkg" Mar 20 11:47:02 crc kubenswrapper[4860]: I0320 11:47:02.552740 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/537f47a7-01d4-449a-8afc-a83a212f4bc5-must-gather-output\") pod \"must-gather-g5zkg\" (UID: \"537f47a7-01d4-449a-8afc-a83a212f4bc5\") " pod="openshift-must-gather-972zc/must-gather-g5zkg" Mar 20 11:47:02 crc kubenswrapper[4860]: I0320 11:47:02.573558 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbg2s\" (UniqueName: \"kubernetes.io/projected/537f47a7-01d4-449a-8afc-a83a212f4bc5-kube-api-access-dbg2s\") pod \"must-gather-g5zkg\" (UID: \"537f47a7-01d4-449a-8afc-a83a212f4bc5\") " pod="openshift-must-gather-972zc/must-gather-g5zkg" Mar 20 11:47:02 crc kubenswrapper[4860]: I0320 11:47:02.658039 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-972zc/must-gather-g5zkg" Mar 20 11:47:02 crc kubenswrapper[4860]: I0320 11:47:02.965741 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-972zc/must-gather-g5zkg"] Mar 20 11:47:02 crc kubenswrapper[4860]: W0320 11:47:02.980731 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod537f47a7_01d4_449a_8afc_a83a212f4bc5.slice/crio-cf9211712c83fdbaf339ec06742e20128d2665dc69097adadd2b0df6ade5b939 WatchSource:0}: Error finding container cf9211712c83fdbaf339ec06742e20128d2665dc69097adadd2b0df6ade5b939: Status 404 returned error can't find the container with id cf9211712c83fdbaf339ec06742e20128d2665dc69097adadd2b0df6ade5b939 Mar 20 11:47:03 crc kubenswrapper[4860]: I0320 11:47:03.989249 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-972zc/must-gather-g5zkg" event={"ID":"537f47a7-01d4-449a-8afc-a83a212f4bc5","Type":"ContainerStarted","Data":"cf9211712c83fdbaf339ec06742e20128d2665dc69097adadd2b0df6ade5b939"} Mar 20 11:47:07 crc kubenswrapper[4860]: I0320 11:47:07.419388 4860 scope.go:117] "RemoveContainer" containerID="2c6cd87f8a77136a1839b5f761bdf9d0b06d37fb07e23d5035e6ffed2dcf296d" Mar 20 11:47:07 crc kubenswrapper[4860]: E0320 11:47:07.420705 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:47:11 crc kubenswrapper[4860]: I0320 11:47:11.072792 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-972zc/must-gather-g5zkg" 
event={"ID":"537f47a7-01d4-449a-8afc-a83a212f4bc5","Type":"ContainerStarted","Data":"e02bf15081b9372bf7659815dbcb88502421613acc550737bba159008d91b9ba"} Mar 20 11:47:11 crc kubenswrapper[4860]: I0320 11:47:11.073716 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-972zc/must-gather-g5zkg" event={"ID":"537f47a7-01d4-449a-8afc-a83a212f4bc5","Type":"ContainerStarted","Data":"fea1bff12d057f80685637551a80eabc97dbecd983b9b59ee632515897ce584b"} Mar 20 11:47:12 crc kubenswrapper[4860]: I0320 11:47:12.099387 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-972zc/must-gather-g5zkg" podStartSLOduration=2.451706799 podStartE2EDuration="10.099363633s" podCreationTimestamp="2026-03-20 11:47:02 +0000 UTC" firstStartedPulling="2026-03-20 11:47:02.982961748 +0000 UTC m=+3147.204322646" lastFinishedPulling="2026-03-20 11:47:10.630618582 +0000 UTC m=+3154.851979480" observedRunningTime="2026-03-20 11:47:12.095743475 +0000 UTC m=+3156.317104383" watchObservedRunningTime="2026-03-20 11:47:12.099363633 +0000 UTC m=+3156.320724521" Mar 20 11:47:18 crc kubenswrapper[4860]: I0320 11:47:18.414272 4860 scope.go:117] "RemoveContainer" containerID="2c6cd87f8a77136a1839b5f761bdf9d0b06d37fb07e23d5035e6ffed2dcf296d" Mar 20 11:47:18 crc kubenswrapper[4860]: E0320 11:47:18.415434 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:47:33 crc kubenswrapper[4860]: I0320 11:47:33.414249 4860 scope.go:117] "RemoveContainer" containerID="2c6cd87f8a77136a1839b5f761bdf9d0b06d37fb07e23d5035e6ffed2dcf296d" Mar 20 11:47:33 crc kubenswrapper[4860]: 
E0320 11:47:33.415591 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:47:48 crc kubenswrapper[4860]: I0320 11:47:48.413582 4860 scope.go:117] "RemoveContainer" containerID="2c6cd87f8a77136a1839b5f761bdf9d0b06d37fb07e23d5035e6ffed2dcf296d" Mar 20 11:47:48 crc kubenswrapper[4860]: E0320 11:47:48.414732 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:48:00 crc kubenswrapper[4860]: I0320 11:48:00.120820 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78dd6ddcc-sjw98_ccb7e541-f715-4030-8091-91f7e9eacb4c/init/0.log" Mar 20 11:48:00 crc kubenswrapper[4860]: I0320 11:48:00.154317 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566788-dlb4m"] Mar 20 11:48:00 crc kubenswrapper[4860]: I0320 11:48:00.155714 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566788-dlb4m" Mar 20 11:48:00 crc kubenswrapper[4860]: I0320 11:48:00.160282 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:48:00 crc kubenswrapper[4860]: I0320 11:48:00.160498 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:48:00 crc kubenswrapper[4860]: I0320 11:48:00.161093 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-82p2f" Mar 20 11:48:00 crc kubenswrapper[4860]: I0320 11:48:00.173597 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566788-dlb4m"] Mar 20 11:48:00 crc kubenswrapper[4860]: I0320 11:48:00.262582 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7j5jj\" (UniqueName: \"kubernetes.io/projected/0f342006-66c4-4bc6-9577-1aa4db4b4210-kube-api-access-7j5jj\") pod \"auto-csr-approver-29566788-dlb4m\" (UID: \"0f342006-66c4-4bc6-9577-1aa4db4b4210\") " pod="openshift-infra/auto-csr-approver-29566788-dlb4m" Mar 20 11:48:00 crc kubenswrapper[4860]: I0320 11:48:00.359285 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78dd6ddcc-sjw98_ccb7e541-f715-4030-8091-91f7e9eacb4c/init/0.log" Mar 20 11:48:00 crc kubenswrapper[4860]: I0320 11:48:00.363530 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78dd6ddcc-sjw98_ccb7e541-f715-4030-8091-91f7e9eacb4c/dnsmasq-dns/0.log" Mar 20 11:48:00 crc kubenswrapper[4860]: I0320 11:48:00.364750 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7j5jj\" (UniqueName: \"kubernetes.io/projected/0f342006-66c4-4bc6-9577-1aa4db4b4210-kube-api-access-7j5jj\") pod \"auto-csr-approver-29566788-dlb4m\" (UID: 
\"0f342006-66c4-4bc6-9577-1aa4db4b4210\") " pod="openshift-infra/auto-csr-approver-29566788-dlb4m" Mar 20 11:48:00 crc kubenswrapper[4860]: I0320 11:48:00.392825 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7j5jj\" (UniqueName: \"kubernetes.io/projected/0f342006-66c4-4bc6-9577-1aa4db4b4210-kube-api-access-7j5jj\") pod \"auto-csr-approver-29566788-dlb4m\" (UID: \"0f342006-66c4-4bc6-9577-1aa4db4b4210\") " pod="openshift-infra/auto-csr-approver-29566788-dlb4m" Mar 20 11:48:00 crc kubenswrapper[4860]: I0320 11:48:00.482044 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566788-dlb4m" Mar 20 11:48:00 crc kubenswrapper[4860]: I0320 11:48:00.775210 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566788-dlb4m"] Mar 20 11:48:00 crc kubenswrapper[4860]: W0320 11:48:00.786204 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f342006_66c4_4bc6_9577_1aa4db4b4210.slice/crio-57ac84480514a47c7671f6d10def87abaaf2b0b0003264f548db5f1f5c7cb2df WatchSource:0}: Error finding container 57ac84480514a47c7671f6d10def87abaaf2b0b0003264f548db5f1f5c7cb2df: Status 404 returned error can't find the container with id 57ac84480514a47c7671f6d10def87abaaf2b0b0003264f548db5f1f5c7cb2df Mar 20 11:48:01 crc kubenswrapper[4860]: I0320 11:48:01.656528 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566788-dlb4m" event={"ID":"0f342006-66c4-4bc6-9577-1aa4db4b4210","Type":"ContainerStarted","Data":"57ac84480514a47c7671f6d10def87abaaf2b0b0003264f548db5f1f5c7cb2df"} Mar 20 11:48:02 crc kubenswrapper[4860]: I0320 11:48:02.666387 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566788-dlb4m" 
event={"ID":"0f342006-66c4-4bc6-9577-1aa4db4b4210","Type":"ContainerStarted","Data":"e6d46e2bb0f38724bb3cbbcdce2b9456f7002fbe4fd812111c54f54bb22a93b2"} Mar 20 11:48:02 crc kubenswrapper[4860]: I0320 11:48:02.696777 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566788-dlb4m" podStartSLOduration=1.64140723 podStartE2EDuration="2.696753959s" podCreationTimestamp="2026-03-20 11:48:00 +0000 UTC" firstStartedPulling="2026-03-20 11:48:00.789023376 +0000 UTC m=+3205.010384274" lastFinishedPulling="2026-03-20 11:48:01.844370105 +0000 UTC m=+3206.065731003" observedRunningTime="2026-03-20 11:48:02.687290304 +0000 UTC m=+3206.908651202" watchObservedRunningTime="2026-03-20 11:48:02.696753959 +0000 UTC m=+3206.918114857" Mar 20 11:48:03 crc kubenswrapper[4860]: I0320 11:48:03.414063 4860 scope.go:117] "RemoveContainer" containerID="2c6cd87f8a77136a1839b5f761bdf9d0b06d37fb07e23d5035e6ffed2dcf296d" Mar 20 11:48:03 crc kubenswrapper[4860]: E0320 11:48:03.414321 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:48:03 crc kubenswrapper[4860]: I0320 11:48:03.676723 4860 generic.go:334] "Generic (PLEG): container finished" podID="0f342006-66c4-4bc6-9577-1aa4db4b4210" containerID="e6d46e2bb0f38724bb3cbbcdce2b9456f7002fbe4fd812111c54f54bb22a93b2" exitCode=0 Mar 20 11:48:03 crc kubenswrapper[4860]: I0320 11:48:03.676791 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566788-dlb4m" 
event={"ID":"0f342006-66c4-4bc6-9577-1aa4db4b4210","Type":"ContainerDied","Data":"e6d46e2bb0f38724bb3cbbcdce2b9456f7002fbe4fd812111c54f54bb22a93b2"} Mar 20 11:48:04 crc kubenswrapper[4860]: I0320 11:48:04.981273 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566788-dlb4m" Mar 20 11:48:05 crc kubenswrapper[4860]: I0320 11:48:05.037922 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7j5jj\" (UniqueName: \"kubernetes.io/projected/0f342006-66c4-4bc6-9577-1aa4db4b4210-kube-api-access-7j5jj\") pod \"0f342006-66c4-4bc6-9577-1aa4db4b4210\" (UID: \"0f342006-66c4-4bc6-9577-1aa4db4b4210\") " Mar 20 11:48:05 crc kubenswrapper[4860]: I0320 11:48:05.046213 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f342006-66c4-4bc6-9577-1aa4db4b4210-kube-api-access-7j5jj" (OuterVolumeSpecName: "kube-api-access-7j5jj") pod "0f342006-66c4-4bc6-9577-1aa4db4b4210" (UID: "0f342006-66c4-4bc6-9577-1aa4db4b4210"). InnerVolumeSpecName "kube-api-access-7j5jj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:48:05 crc kubenswrapper[4860]: I0320 11:48:05.140357 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7j5jj\" (UniqueName: \"kubernetes.io/projected/0f342006-66c4-4bc6-9577-1aa4db4b4210-kube-api-access-7j5jj\") on node \"crc\" DevicePath \"\"" Mar 20 11:48:05 crc kubenswrapper[4860]: I0320 11:48:05.694425 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566788-dlb4m" event={"ID":"0f342006-66c4-4bc6-9577-1aa4db4b4210","Type":"ContainerDied","Data":"57ac84480514a47c7671f6d10def87abaaf2b0b0003264f548db5f1f5c7cb2df"} Mar 20 11:48:05 crc kubenswrapper[4860]: I0320 11:48:05.694494 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="57ac84480514a47c7671f6d10def87abaaf2b0b0003264f548db5f1f5c7cb2df" Mar 20 11:48:05 crc kubenswrapper[4860]: I0320 11:48:05.694518 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566788-dlb4m" Mar 20 11:48:05 crc kubenswrapper[4860]: I0320 11:48:05.781745 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566782-lzh8h"] Mar 20 11:48:05 crc kubenswrapper[4860]: I0320 11:48:05.788660 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566782-lzh8h"] Mar 20 11:48:07 crc kubenswrapper[4860]: I0320 11:48:07.424030 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfc74c40-1bf8-47fb-91a4-a6e27724dff9" path="/var/lib/kubelet/pods/cfc74c40-1bf8-47fb-91a4-a6e27724dff9/volumes" Mar 20 11:48:15 crc kubenswrapper[4860]: I0320 11:48:15.246028 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858pzhxp_c6145c03-dfdd-4224-b2b0-6087b1f137d1/util/0.log" Mar 20 11:48:15 crc kubenswrapper[4860]: I0320 11:48:15.465986 
4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858pzhxp_c6145c03-dfdd-4224-b2b0-6087b1f137d1/util/0.log" Mar 20 11:48:15 crc kubenswrapper[4860]: I0320 11:48:15.467568 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858pzhxp_c6145c03-dfdd-4224-b2b0-6087b1f137d1/pull/0.log" Mar 20 11:48:15 crc kubenswrapper[4860]: I0320 11:48:15.469078 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858pzhxp_c6145c03-dfdd-4224-b2b0-6087b1f137d1/pull/0.log" Mar 20 11:48:15 crc kubenswrapper[4860]: I0320 11:48:15.715135 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858pzhxp_c6145c03-dfdd-4224-b2b0-6087b1f137d1/pull/0.log" Mar 20 11:48:15 crc kubenswrapper[4860]: I0320 11:48:15.719727 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858pzhxp_c6145c03-dfdd-4224-b2b0-6087b1f137d1/util/0.log" Mar 20 11:48:15 crc kubenswrapper[4860]: I0320 11:48:15.720411 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858pzhxp_c6145c03-dfdd-4224-b2b0-6087b1f137d1/extract/0.log" Mar 20 11:48:15 crc kubenswrapper[4860]: I0320 11:48:15.905745 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-59bc569d95-8dh72_8b4d2530-4f67-45e8-9444-bea25fdad6ae/manager/0.log" Mar 20 11:48:16 crc kubenswrapper[4860]: I0320 11:48:16.167336 4860 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_designate-operator-controller-manager-588d4d986b-2692b_20d35dc6-0fc2-4651-9dcd-855814132a5f/manager/0.log" Mar 20 11:48:16 crc kubenswrapper[4860]: I0320 11:48:16.310642 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-79df6bcc97-zphz9_5fdbc315-f7fd-47ac-aa39-fdbe068f6f3b/manager/0.log" Mar 20 11:48:16 crc kubenswrapper[4860]: I0320 11:48:16.413259 4860 scope.go:117] "RemoveContainer" containerID="2c6cd87f8a77136a1839b5f761bdf9d0b06d37fb07e23d5035e6ffed2dcf296d" Mar 20 11:48:16 crc kubenswrapper[4860]: E0320 11:48:16.413556 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:48:16 crc kubenswrapper[4860]: I0320 11:48:16.508173 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-67dd5f86f5-vw2d9_36138670-7449-4d49-8a23-73b57d10b67f/manager/0.log" Mar 20 11:48:16 crc kubenswrapper[4860]: I0320 11:48:16.568026 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d58dc466-s2kwq_178fff2d-699c-4cab-8626-3e30a6bd9ed6/manager/0.log" Mar 20 11:48:16 crc kubenswrapper[4860]: I0320 11:48:16.683605 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-8464cc45fb-wfczk_c54f27c4-bd61-4bad-bf91-376fee65d219/manager/0.log" Mar 20 11:48:16 crc kubenswrapper[4860]: I0320 11:48:16.833873 4860 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_infra-operator-controller-manager-669fff9c7c-njzqs_70703379-8eb2-4f8a-95c8-302b53692a53/manager/0.log" Mar 20 11:48:16 crc kubenswrapper[4860]: I0320 11:48:16.900333 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6f787dddc9-mc48w_acf57205-3b95-48a3-8222-1b57b0b6c54b/manager/0.log" Mar 20 11:48:17 crc kubenswrapper[4860]: I0320 11:48:17.051262 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-768b96df4c-pq75b_fbbe8243-9afb-4fc5-90f1-04d6f0c074ef/manager/0.log" Mar 20 11:48:17 crc kubenswrapper[4860]: I0320 11:48:17.120420 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-55f864c847-pzk5m_0fe9b978-da91-4568-9b77-0d5930aca888/manager/0.log" Mar 20 11:48:17 crc kubenswrapper[4860]: I0320 11:48:17.272917 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67ccfc9778-m8948_d7202366-6dc1-45ca-bb9a-74bdd0426c5f/manager/0.log" Mar 20 11:48:17 crc kubenswrapper[4860]: I0320 11:48:17.327064 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-767865f676-2vsjq_29801d0c-963e-4b38-ad2d-8b03d3ade0be/manager/0.log" Mar 20 11:48:17 crc kubenswrapper[4860]: I0320 11:48:17.526524 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5d488d59fb-z8fp5_6c2530cf-70b4-4a89-acff-086b36773edf/manager/0.log" Mar 20 11:48:17 crc kubenswrapper[4860]: I0320 11:48:17.581518 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5b9f45d989-tjt52_431ab970-7f36-4ace-860c-479faac092a0/manager/0.log" Mar 20 11:48:17 crc kubenswrapper[4860]: I0320 11:48:17.784440 4860 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-89d64c458-5tqgx_ecf64e38-138d-4ef7-8b17-c09f30358f3e/manager/0.log" Mar 20 11:48:17 crc kubenswrapper[4860]: I0320 11:48:17.982948 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-846ffbb776-dppd5_31f3fcff-ca2c-40b5-bdf3-018132ccb63b/operator/0.log" Mar 20 11:48:18 crc kubenswrapper[4860]: I0320 11:48:18.089759 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6697dffbc-hpk42_84431296-0ca0-425a-8da8-c3ea46b08b29/manager/0.log" Mar 20 11:48:18 crc kubenswrapper[4860]: I0320 11:48:18.238193 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-82h6r_f7193309-39f9-4487-b02b-8e9e4d6a69ff/registry-server/0.log" Mar 20 11:48:18 crc kubenswrapper[4860]: I0320 11:48:18.308408 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-884679f54-4nk5c_c736e6d7-6806-4ef3-a0b3-f1b17ab33037/manager/0.log" Mar 20 11:48:18 crc kubenswrapper[4860]: I0320 11:48:18.454209 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5784578c99-4tdg4_7f73053a-86aa-42dc-bcca-ee26a4fda2e5/manager/0.log" Mar 20 11:48:18 crc kubenswrapper[4860]: I0320 11:48:18.515008 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-c674c5965-dvptb_cce5926a-9df6-4915-a94f-02cf2f74fccc/manager/0.log" Mar 20 11:48:18 crc kubenswrapper[4860]: I0320 11:48:18.689109 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-d6b694c5-jd9bn_b5e881e2-f657-418f-ba87-7074722307a2/manager/0.log" Mar 20 11:48:18 crc kubenswrapper[4860]: I0320 11:48:18.752308 4860 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-b4zcf_f329ab6d-5c8c-4ed2-a830-d0a04bb31071/manager/0.log" Mar 20 11:48:18 crc kubenswrapper[4860]: I0320 11:48:18.922916 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c4d75f7f9-ncmzn_1723efcf-97d7-4101-a15d-d4776d45d29b/manager/0.log" Mar 20 11:48:28 crc kubenswrapper[4860]: I0320 11:48:28.414042 4860 scope.go:117] "RemoveContainer" containerID="2c6cd87f8a77136a1839b5f761bdf9d0b06d37fb07e23d5035e6ffed2dcf296d" Mar 20 11:48:28 crc kubenswrapper[4860]: E0320 11:48:28.415147 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:48:38 crc kubenswrapper[4860]: I0320 11:48:38.572763 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-jnqsw_8bc351c5-b724-443e-a7e2-f4abba352cef/control-plane-machine-set-operator/0.log" Mar 20 11:48:38 crc kubenswrapper[4860]: I0320 11:48:38.745972 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-s52jd_d4ce1856-395a-4003-9642-61da7cbdd789/kube-rbac-proxy/0.log" Mar 20 11:48:38 crc kubenswrapper[4860]: I0320 11:48:38.805209 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-s52jd_d4ce1856-395a-4003-9642-61da7cbdd789/machine-api-operator/0.log" Mar 20 11:48:41 crc kubenswrapper[4860]: I0320 11:48:41.417620 4860 scope.go:117] "RemoveContainer" 
containerID="2c6cd87f8a77136a1839b5f761bdf9d0b06d37fb07e23d5035e6ffed2dcf296d" Mar 20 11:48:41 crc kubenswrapper[4860]: E0320 11:48:41.418380 4860 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdqp_openshift-machine-config-operator(6a9df230-75a1-4b64-8d00-c179e9c19080)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" Mar 20 11:48:49 crc kubenswrapper[4860]: I0320 11:48:49.594173 4860 scope.go:117] "RemoveContainer" containerID="9f2e26a1e8f88c5ac76c8ad0792718b788b15449739711bc2614bdd4cd541855" Mar 20 11:48:51 crc kubenswrapper[4860]: I0320 11:48:51.535498 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-545d4d4674-x6lwp_74ed19c1-0e46-4fed-b50f-155eaa38aed9/cert-manager-controller/0.log" Mar 20 11:48:51 crc kubenswrapper[4860]: I0320 11:48:51.734649 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-5545bd876-sjz7s_aa7d8eaa-20ac-4ea3-b19d-e8f89054c619/cert-manager-cainjector/0.log" Mar 20 11:48:51 crc kubenswrapper[4860]: I0320 11:48:51.861391 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-6888856db4-skhrl_4587c778-c12c-48e0-8c28-7eb7a7c1b722/cert-manager-webhook/0.log" Mar 20 11:48:53 crc kubenswrapper[4860]: I0320 11:48:53.414032 4860 scope.go:117] "RemoveContainer" containerID="2c6cd87f8a77136a1839b5f761bdf9d0b06d37fb07e23d5035e6ffed2dcf296d" Mar 20 11:48:54 crc kubenswrapper[4860]: I0320 11:48:54.071665 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" event={"ID":"6a9df230-75a1-4b64-8d00-c179e9c19080","Type":"ContainerStarted","Data":"27de1546d8fc12cd142537f3e435648e650a77886fa9d1677aedbb1fcd90c4bd"} 
Mar 20 11:49:04 crc kubenswrapper[4860]: I0320 11:49:04.685248 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-l98lx_a91c6f2b-7646-4f4d-bdc2-47304e36da4e/nmstate-console-plugin/0.log" Mar 20 11:49:04 crc kubenswrapper[4860]: I0320 11:49:04.877568 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-mdh82_ef7f3b63-3a7d-483b-95c1-32961dad6226/nmstate-handler/0.log" Mar 20 11:49:05 crc kubenswrapper[4860]: I0320 11:49:05.015615 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-wr9vc_6f56c0b5-3d27-49e6-af5b-6ad929d9e857/kube-rbac-proxy/0.log" Mar 20 11:49:05 crc kubenswrapper[4860]: I0320 11:49:05.043542 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-wr9vc_6f56c0b5-3d27-49e6-af5b-6ad929d9e857/nmstate-metrics/0.log" Mar 20 11:49:05 crc kubenswrapper[4860]: I0320 11:49:05.154472 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-dmczs_ce7d9f29-28cd-4038-b492-b18e0b129907/nmstate-operator/0.log" Mar 20 11:49:05 crc kubenswrapper[4860]: I0320 11:49:05.260642 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-9cfpv_db5d41a4-2808-4189-8c3e-e0730cdf1a4f/nmstate-webhook/0.log" Mar 20 11:49:32 crc kubenswrapper[4860]: I0320 11:49:32.220962 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-tdcbt_a2a3b82e-416b-4757-8719-97c58493428e/kube-rbac-proxy/0.log" Mar 20 11:49:32 crc kubenswrapper[4860]: I0320 11:49:32.303506 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-tdcbt_a2a3b82e-416b-4757-8719-97c58493428e/controller/0.log" Mar 20 11:49:32 crc kubenswrapper[4860]: I0320 11:49:32.454444 4860 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-mzzpz_dcd69c7f-fded-4b09-bd44-607b27716196/cp-frr-files/0.log" Mar 20 11:49:32 crc kubenswrapper[4860]: I0320 11:49:32.694458 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mzzpz_dcd69c7f-fded-4b09-bd44-607b27716196/cp-frr-files/0.log" Mar 20 11:49:32 crc kubenswrapper[4860]: I0320 11:49:32.694581 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mzzpz_dcd69c7f-fded-4b09-bd44-607b27716196/cp-metrics/0.log" Mar 20 11:49:32 crc kubenswrapper[4860]: I0320 11:49:32.707799 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mzzpz_dcd69c7f-fded-4b09-bd44-607b27716196/cp-reloader/0.log" Mar 20 11:49:32 crc kubenswrapper[4860]: I0320 11:49:32.709526 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mzzpz_dcd69c7f-fded-4b09-bd44-607b27716196/cp-reloader/0.log" Mar 20 11:49:32 crc kubenswrapper[4860]: I0320 11:49:32.962546 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mzzpz_dcd69c7f-fded-4b09-bd44-607b27716196/cp-frr-files/0.log" Mar 20 11:49:32 crc kubenswrapper[4860]: I0320 11:49:32.966434 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mzzpz_dcd69c7f-fded-4b09-bd44-607b27716196/cp-reloader/0.log" Mar 20 11:49:32 crc kubenswrapper[4860]: I0320 11:49:32.973038 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mzzpz_dcd69c7f-fded-4b09-bd44-607b27716196/cp-metrics/0.log" Mar 20 11:49:33 crc kubenswrapper[4860]: I0320 11:49:33.019937 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mzzpz_dcd69c7f-fded-4b09-bd44-607b27716196/cp-metrics/0.log" Mar 20 11:49:33 crc kubenswrapper[4860]: I0320 11:49:33.208540 4860 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-mzzpz_dcd69c7f-fded-4b09-bd44-607b27716196/cp-frr-files/0.log" Mar 20 11:49:33 crc kubenswrapper[4860]: I0320 11:49:33.220687 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mzzpz_dcd69c7f-fded-4b09-bd44-607b27716196/cp-metrics/0.log" Mar 20 11:49:33 crc kubenswrapper[4860]: I0320 11:49:33.224961 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mzzpz_dcd69c7f-fded-4b09-bd44-607b27716196/cp-reloader/0.log" Mar 20 11:49:33 crc kubenswrapper[4860]: I0320 11:49:33.251541 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mzzpz_dcd69c7f-fded-4b09-bd44-607b27716196/controller/0.log" Mar 20 11:49:33 crc kubenswrapper[4860]: I0320 11:49:33.417208 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mzzpz_dcd69c7f-fded-4b09-bd44-607b27716196/kube-rbac-proxy/0.log" Mar 20 11:49:33 crc kubenswrapper[4860]: I0320 11:49:33.437463 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mzzpz_dcd69c7f-fded-4b09-bd44-607b27716196/frr-metrics/0.log" Mar 20 11:49:33 crc kubenswrapper[4860]: I0320 11:49:33.508136 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mzzpz_dcd69c7f-fded-4b09-bd44-607b27716196/kube-rbac-proxy-frr/0.log" Mar 20 11:49:33 crc kubenswrapper[4860]: I0320 11:49:33.656398 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mzzpz_dcd69c7f-fded-4b09-bd44-607b27716196/reloader/0.log" Mar 20 11:49:33 crc kubenswrapper[4860]: I0320 11:49:33.789296 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-jhncx_3c1b54d0-fcfb-451b-ae3d-b731d3f9f6de/frr-k8s-webhook-server/0.log" Mar 20 11:49:33 crc kubenswrapper[4860]: I0320 11:49:33.927248 4860 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-mzzpz_dcd69c7f-fded-4b09-bd44-607b27716196/frr/0.log" Mar 20 11:49:33 crc kubenswrapper[4860]: I0320 11:49:33.933441 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-5c589f6ccd-bcpmf_bb8f951b-6aa9-420c-9bad-dfa857482d4c/manager/0.log" Mar 20 11:49:34 crc kubenswrapper[4860]: I0320 11:49:34.097010 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-587dc5fb9c-2t48s_1eb0189c-2177-4c4e-83f6-7ba051322847/webhook-server/0.log" Mar 20 11:49:34 crc kubenswrapper[4860]: I0320 11:49:34.126554 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-brjk7_6ee4e9c2-66c1-4431-bde4-29d09a044a32/kube-rbac-proxy/0.log" Mar 20 11:49:34 crc kubenswrapper[4860]: I0320 11:49:34.398302 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-brjk7_6ee4e9c2-66c1-4431-bde4-29d09a044a32/speaker/0.log" Mar 20 11:49:48 crc kubenswrapper[4860]: I0320 11:49:48.737695 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746cdjd_eb4de49a-fca6-4c8c-8484-461859f95884/util/0.log" Mar 20 11:49:49 crc kubenswrapper[4860]: I0320 11:49:49.005382 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746cdjd_eb4de49a-fca6-4c8c-8484-461859f95884/util/0.log" Mar 20 11:49:49 crc kubenswrapper[4860]: I0320 11:49:49.039258 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746cdjd_eb4de49a-fca6-4c8c-8484-461859f95884/pull/0.log" Mar 20 11:49:49 crc kubenswrapper[4860]: I0320 11:49:49.063122 4860 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746cdjd_eb4de49a-fca6-4c8c-8484-461859f95884/pull/0.log" Mar 20 11:49:49 crc kubenswrapper[4860]: I0320 11:49:49.274745 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746cdjd_eb4de49a-fca6-4c8c-8484-461859f95884/util/0.log" Mar 20 11:49:49 crc kubenswrapper[4860]: I0320 11:49:49.277482 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746cdjd_eb4de49a-fca6-4c8c-8484-461859f95884/extract/0.log" Mar 20 11:49:49 crc kubenswrapper[4860]: I0320 11:49:49.313786 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8746cdjd_eb4de49a-fca6-4c8c-8484-461859f95884/pull/0.log" Mar 20 11:49:49 crc kubenswrapper[4860]: I0320 11:49:49.486553 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sgngn_7da0294e-5ac5-4655-b882-cfd1f36ce791/util/0.log" Mar 20 11:49:49 crc kubenswrapper[4860]: I0320 11:49:49.655100 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sgngn_7da0294e-5ac5-4655-b882-cfd1f36ce791/pull/0.log" Mar 20 11:49:49 crc kubenswrapper[4860]: I0320 11:49:49.695833 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sgngn_7da0294e-5ac5-4655-b882-cfd1f36ce791/util/0.log" Mar 20 11:49:49 crc kubenswrapper[4860]: I0320 11:49:49.720266 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sgngn_7da0294e-5ac5-4655-b882-cfd1f36ce791/pull/0.log" Mar 20 
11:49:50 crc kubenswrapper[4860]: I0320 11:49:50.094702 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sgngn_7da0294e-5ac5-4655-b882-cfd1f36ce791/extract/0.log" Mar 20 11:49:50 crc kubenswrapper[4860]: I0320 11:49:50.128712 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sgngn_7da0294e-5ac5-4655-b882-cfd1f36ce791/util/0.log" Mar 20 11:49:50 crc kubenswrapper[4860]: I0320 11:49:50.135963 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sgngn_7da0294e-5ac5-4655-b882-cfd1f36ce791/pull/0.log" Mar 20 11:49:50 crc kubenswrapper[4860]: I0320 11:49:50.299795 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5km9m2_ea4498dd-681d-4260-b895-06e53dbcc9b9/util/0.log" Mar 20 11:49:50 crc kubenswrapper[4860]: I0320 11:49:50.482549 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5km9m2_ea4498dd-681d-4260-b895-06e53dbcc9b9/util/0.log" Mar 20 11:49:50 crc kubenswrapper[4860]: I0320 11:49:50.543083 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5km9m2_ea4498dd-681d-4260-b895-06e53dbcc9b9/pull/0.log" Mar 20 11:49:50 crc kubenswrapper[4860]: I0320 11:49:50.575827 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5km9m2_ea4498dd-681d-4260-b895-06e53dbcc9b9/pull/0.log" Mar 20 11:49:50 crc kubenswrapper[4860]: I0320 11:49:50.794127 4860 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5km9m2_ea4498dd-681d-4260-b895-06e53dbcc9b9/util/0.log" Mar 20 11:49:50 crc kubenswrapper[4860]: I0320 11:49:50.822618 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5km9m2_ea4498dd-681d-4260-b895-06e53dbcc9b9/pull/0.log" Mar 20 11:49:50 crc kubenswrapper[4860]: I0320 11:49:50.840254 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5km9m2_ea4498dd-681d-4260-b895-06e53dbcc9b9/extract/0.log" Mar 20 11:49:51 crc kubenswrapper[4860]: I0320 11:49:51.000175 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-47qz8_a3a77828-39d7-4547-ba09-26a9a0fb8e7b/extract-utilities/0.log" Mar 20 11:49:51 crc kubenswrapper[4860]: I0320 11:49:51.220456 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-47qz8_a3a77828-39d7-4547-ba09-26a9a0fb8e7b/extract-content/0.log" Mar 20 11:49:51 crc kubenswrapper[4860]: I0320 11:49:51.220689 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-47qz8_a3a77828-39d7-4547-ba09-26a9a0fb8e7b/extract-content/0.log" Mar 20 11:49:51 crc kubenswrapper[4860]: I0320 11:49:51.240104 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-47qz8_a3a77828-39d7-4547-ba09-26a9a0fb8e7b/extract-utilities/0.log" Mar 20 11:49:51 crc kubenswrapper[4860]: I0320 11:49:51.424882 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-47qz8_a3a77828-39d7-4547-ba09-26a9a0fb8e7b/extract-utilities/0.log" Mar 20 11:49:51 crc kubenswrapper[4860]: I0320 11:49:51.507581 4860 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-47qz8_a3a77828-39d7-4547-ba09-26a9a0fb8e7b/extract-content/0.log" Mar 20 11:49:51 crc kubenswrapper[4860]: I0320 11:49:51.717304 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-nvhg7_4ab38144-c30d-4aed-884c-8ace682fe5ea/extract-utilities/0.log" Mar 20 11:49:51 crc kubenswrapper[4860]: I0320 11:49:51.967472 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-nvhg7_4ab38144-c30d-4aed-884c-8ace682fe5ea/extract-content/0.log" Mar 20 11:49:51 crc kubenswrapper[4860]: I0320 11:49:51.978182 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-nvhg7_4ab38144-c30d-4aed-884c-8ace682fe5ea/extract-utilities/0.log" Mar 20 11:49:52 crc kubenswrapper[4860]: I0320 11:49:52.045991 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-nvhg7_4ab38144-c30d-4aed-884c-8ace682fe5ea/extract-content/0.log" Mar 20 11:49:52 crc kubenswrapper[4860]: I0320 11:49:52.073123 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-47qz8_a3a77828-39d7-4547-ba09-26a9a0fb8e7b/registry-server/0.log" Mar 20 11:49:52 crc kubenswrapper[4860]: I0320 11:49:52.258320 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-nvhg7_4ab38144-c30d-4aed-884c-8ace682fe5ea/extract-content/0.log" Mar 20 11:49:52 crc kubenswrapper[4860]: I0320 11:49:52.268891 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-nvhg7_4ab38144-c30d-4aed-884c-8ace682fe5ea/extract-utilities/0.log" Mar 20 11:49:52 crc kubenswrapper[4860]: I0320 11:49:52.528212 4860 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-nvhg7_4ab38144-c30d-4aed-884c-8ace682fe5ea/registry-server/0.log" Mar 20 11:49:52 crc kubenswrapper[4860]: I0320 11:49:52.545051 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-qkfjv_489f9463-a47c-4635-aad3-866e47a2c97f/marketplace-operator/0.log" Mar 20 11:49:52 crc kubenswrapper[4860]: I0320 11:49:52.583893 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mhds4_f820689f-28ee-4cbe-bf7b-049d9ec6ef64/extract-utilities/0.log" Mar 20 11:49:52 crc kubenswrapper[4860]: I0320 11:49:52.795569 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mhds4_f820689f-28ee-4cbe-bf7b-049d9ec6ef64/extract-content/0.log" Mar 20 11:49:52 crc kubenswrapper[4860]: I0320 11:49:52.800670 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mhds4_f820689f-28ee-4cbe-bf7b-049d9ec6ef64/extract-content/0.log" Mar 20 11:49:52 crc kubenswrapper[4860]: I0320 11:49:52.802687 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mhds4_f820689f-28ee-4cbe-bf7b-049d9ec6ef64/extract-utilities/0.log" Mar 20 11:49:52 crc kubenswrapper[4860]: I0320 11:49:52.976629 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mhds4_f820689f-28ee-4cbe-bf7b-049d9ec6ef64/extract-utilities/0.log" Mar 20 11:49:53 crc kubenswrapper[4860]: I0320 11:49:53.046203 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mhds4_f820689f-28ee-4cbe-bf7b-049d9ec6ef64/extract-content/0.log" Mar 20 11:49:53 crc kubenswrapper[4860]: I0320 11:49:53.213744 4860 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-mhds4_f820689f-28ee-4cbe-bf7b-049d9ec6ef64/registry-server/0.log" Mar 20 11:49:53 crc kubenswrapper[4860]: I0320 11:49:53.286476 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zrj5v_8d34a762-55ad-41cb-994e-d4707bfebe22/extract-utilities/0.log" Mar 20 11:49:53 crc kubenswrapper[4860]: I0320 11:49:53.485502 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zrj5v_8d34a762-55ad-41cb-994e-d4707bfebe22/extract-utilities/0.log" Mar 20 11:49:53 crc kubenswrapper[4860]: I0320 11:49:53.517136 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zrj5v_8d34a762-55ad-41cb-994e-d4707bfebe22/extract-content/0.log" Mar 20 11:49:53 crc kubenswrapper[4860]: I0320 11:49:53.532345 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zrj5v_8d34a762-55ad-41cb-994e-d4707bfebe22/extract-content/0.log" Mar 20 11:49:53 crc kubenswrapper[4860]: I0320 11:49:53.884123 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zrj5v_8d34a762-55ad-41cb-994e-d4707bfebe22/extract-utilities/0.log" Mar 20 11:49:53 crc kubenswrapper[4860]: I0320 11:49:53.904733 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zrj5v_8d34a762-55ad-41cb-994e-d4707bfebe22/extract-content/0.log" Mar 20 11:49:54 crc kubenswrapper[4860]: I0320 11:49:54.300943 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zrj5v_8d34a762-55ad-41cb-994e-d4707bfebe22/registry-server/0.log" Mar 20 11:50:00 crc kubenswrapper[4860]: I0320 11:50:00.194697 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566790-b9m2r"] Mar 20 11:50:00 crc kubenswrapper[4860]: E0320 11:50:00.195952 4860 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f342006-66c4-4bc6-9577-1aa4db4b4210" containerName="oc" Mar 20 11:50:00 crc kubenswrapper[4860]: I0320 11:50:00.195967 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f342006-66c4-4bc6-9577-1aa4db4b4210" containerName="oc" Mar 20 11:50:00 crc kubenswrapper[4860]: I0320 11:50:00.196161 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f342006-66c4-4bc6-9577-1aa4db4b4210" containerName="oc" Mar 20 11:50:00 crc kubenswrapper[4860]: I0320 11:50:00.196859 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566790-b9m2r" Mar 20 11:50:00 crc kubenswrapper[4860]: I0320 11:50:00.201434 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:50:00 crc kubenswrapper[4860]: I0320 11:50:00.201917 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-82p2f" Mar 20 11:50:00 crc kubenswrapper[4860]: I0320 11:50:00.203255 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:50:00 crc kubenswrapper[4860]: I0320 11:50:00.207434 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566790-b9m2r"] Mar 20 11:50:00 crc kubenswrapper[4860]: I0320 11:50:00.314401 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6rqv\" (UniqueName: \"kubernetes.io/projected/c1d1e9de-fef8-4113-b404-ee02a79e962c-kube-api-access-c6rqv\") pod \"auto-csr-approver-29566790-b9m2r\" (UID: \"c1d1e9de-fef8-4113-b404-ee02a79e962c\") " pod="openshift-infra/auto-csr-approver-29566790-b9m2r" Mar 20 11:50:00 crc kubenswrapper[4860]: I0320 11:50:00.415526 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6rqv\" 
(UniqueName: \"kubernetes.io/projected/c1d1e9de-fef8-4113-b404-ee02a79e962c-kube-api-access-c6rqv\") pod \"auto-csr-approver-29566790-b9m2r\" (UID: \"c1d1e9de-fef8-4113-b404-ee02a79e962c\") " pod="openshift-infra/auto-csr-approver-29566790-b9m2r" Mar 20 11:50:00 crc kubenswrapper[4860]: I0320 11:50:00.438617 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6rqv\" (UniqueName: \"kubernetes.io/projected/c1d1e9de-fef8-4113-b404-ee02a79e962c-kube-api-access-c6rqv\") pod \"auto-csr-approver-29566790-b9m2r\" (UID: \"c1d1e9de-fef8-4113-b404-ee02a79e962c\") " pod="openshift-infra/auto-csr-approver-29566790-b9m2r" Mar 20 11:50:00 crc kubenswrapper[4860]: I0320 11:50:00.545265 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566790-b9m2r" Mar 20 11:50:01 crc kubenswrapper[4860]: I0320 11:50:01.017519 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566790-b9m2r"] Mar 20 11:50:01 crc kubenswrapper[4860]: W0320 11:50:01.026473 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1d1e9de_fef8_4113_b404_ee02a79e962c.slice/crio-db7ef98fd469aaf652f9b02b2dd4e904907cd1ccf59d1e39fb8cd9340da628c6 WatchSource:0}: Error finding container db7ef98fd469aaf652f9b02b2dd4e904907cd1ccf59d1e39fb8cd9340da628c6: Status 404 returned error can't find the container with id db7ef98fd469aaf652f9b02b2dd4e904907cd1ccf59d1e39fb8cd9340da628c6 Mar 20 11:50:01 crc kubenswrapper[4860]: I0320 11:50:01.588463 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566790-b9m2r" event={"ID":"c1d1e9de-fef8-4113-b404-ee02a79e962c","Type":"ContainerStarted","Data":"db7ef98fd469aaf652f9b02b2dd4e904907cd1ccf59d1e39fb8cd9340da628c6"} Mar 20 11:50:03 crc kubenswrapper[4860]: I0320 11:50:03.612970 4860 generic.go:334] "Generic (PLEG): 
container finished" podID="c1d1e9de-fef8-4113-b404-ee02a79e962c" containerID="ff3cd657f654c31af8a42160cee2d2823ab955307f4eefc8a66693a18aebac08" exitCode=0 Mar 20 11:50:03 crc kubenswrapper[4860]: I0320 11:50:03.613063 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566790-b9m2r" event={"ID":"c1d1e9de-fef8-4113-b404-ee02a79e962c","Type":"ContainerDied","Data":"ff3cd657f654c31af8a42160cee2d2823ab955307f4eefc8a66693a18aebac08"} Mar 20 11:50:04 crc kubenswrapper[4860]: I0320 11:50:04.920348 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566790-b9m2r" Mar 20 11:50:05 crc kubenswrapper[4860]: I0320 11:50:05.003141 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6rqv\" (UniqueName: \"kubernetes.io/projected/c1d1e9de-fef8-4113-b404-ee02a79e962c-kube-api-access-c6rqv\") pod \"c1d1e9de-fef8-4113-b404-ee02a79e962c\" (UID: \"c1d1e9de-fef8-4113-b404-ee02a79e962c\") " Mar 20 11:50:05 crc kubenswrapper[4860]: I0320 11:50:05.015517 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1d1e9de-fef8-4113-b404-ee02a79e962c-kube-api-access-c6rqv" (OuterVolumeSpecName: "kube-api-access-c6rqv") pod "c1d1e9de-fef8-4113-b404-ee02a79e962c" (UID: "c1d1e9de-fef8-4113-b404-ee02a79e962c"). InnerVolumeSpecName "kube-api-access-c6rqv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:50:05 crc kubenswrapper[4860]: I0320 11:50:05.105648 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6rqv\" (UniqueName: \"kubernetes.io/projected/c1d1e9de-fef8-4113-b404-ee02a79e962c-kube-api-access-c6rqv\") on node \"crc\" DevicePath \"\"" Mar 20 11:50:05 crc kubenswrapper[4860]: I0320 11:50:05.630793 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566790-b9m2r" event={"ID":"c1d1e9de-fef8-4113-b404-ee02a79e962c","Type":"ContainerDied","Data":"db7ef98fd469aaf652f9b02b2dd4e904907cd1ccf59d1e39fb8cd9340da628c6"} Mar 20 11:50:05 crc kubenswrapper[4860]: I0320 11:50:05.630880 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db7ef98fd469aaf652f9b02b2dd4e904907cd1ccf59d1e39fb8cd9340da628c6" Mar 20 11:50:05 crc kubenswrapper[4860]: I0320 11:50:05.630893 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566790-b9m2r" Mar 20 11:50:06 crc kubenswrapper[4860]: I0320 11:50:06.038826 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566784-k8pc4"] Mar 20 11:50:06 crc kubenswrapper[4860]: I0320 11:50:06.057505 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566784-k8pc4"] Mar 20 11:50:07 crc kubenswrapper[4860]: I0320 11:50:07.424462 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d3485ef-9918-4aa6-80d1-c1c295d46ebe" path="/var/lib/kubelet/pods/1d3485ef-9918-4aa6-80d1-c1c295d46ebe/volumes" Mar 20 11:50:49 crc kubenswrapper[4860]: I0320 11:50:49.690303 4860 scope.go:117] "RemoveContainer" containerID="7c62f1c8ef0515a28ab838f145210c3776f9c242b812e79c909a339bcd0bc452" Mar 20 11:51:17 crc kubenswrapper[4860]: I0320 11:51:17.242260 4860 generic.go:334] "Generic (PLEG): container finished" 
podID="537f47a7-01d4-449a-8afc-a83a212f4bc5" containerID="fea1bff12d057f80685637551a80eabc97dbecd983b9b59ee632515897ce584b" exitCode=0 Mar 20 11:51:17 crc kubenswrapper[4860]: I0320 11:51:17.242341 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-972zc/must-gather-g5zkg" event={"ID":"537f47a7-01d4-449a-8afc-a83a212f4bc5","Type":"ContainerDied","Data":"fea1bff12d057f80685637551a80eabc97dbecd983b9b59ee632515897ce584b"} Mar 20 11:51:17 crc kubenswrapper[4860]: I0320 11:51:17.243914 4860 scope.go:117] "RemoveContainer" containerID="fea1bff12d057f80685637551a80eabc97dbecd983b9b59ee632515897ce584b" Mar 20 11:51:18 crc kubenswrapper[4860]: I0320 11:51:18.045363 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-972zc_must-gather-g5zkg_537f47a7-01d4-449a-8afc-a83a212f4bc5/gather/0.log" Mar 20 11:51:22 crc kubenswrapper[4860]: I0320 11:51:22.345019 4860 patch_prober.go:28] interesting pod/machine-config-daemon-kvdqp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:51:22 crc kubenswrapper[4860]: I0320 11:51:22.345544 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:51:26 crc kubenswrapper[4860]: I0320 11:51:26.230632 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-972zc/must-gather-g5zkg"] Mar 20 11:51:26 crc kubenswrapper[4860]: I0320 11:51:26.231740 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-972zc/must-gather-g5zkg" 
podUID="537f47a7-01d4-449a-8afc-a83a212f4bc5" containerName="copy" containerID="cri-o://e02bf15081b9372bf7659815dbcb88502421613acc550737bba159008d91b9ba" gracePeriod=2 Mar 20 11:51:26 crc kubenswrapper[4860]: I0320 11:51:26.237340 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-972zc/must-gather-g5zkg"] Mar 20 11:51:26 crc kubenswrapper[4860]: I0320 11:51:26.728010 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-972zc_must-gather-g5zkg_537f47a7-01d4-449a-8afc-a83a212f4bc5/copy/0.log" Mar 20 11:51:26 crc kubenswrapper[4860]: I0320 11:51:26.728995 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-972zc/must-gather-g5zkg" Mar 20 11:51:26 crc kubenswrapper[4860]: I0320 11:51:26.823002 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/537f47a7-01d4-449a-8afc-a83a212f4bc5-must-gather-output\") pod \"537f47a7-01d4-449a-8afc-a83a212f4bc5\" (UID: \"537f47a7-01d4-449a-8afc-a83a212f4bc5\") " Mar 20 11:51:26 crc kubenswrapper[4860]: I0320 11:51:26.823065 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbg2s\" (UniqueName: \"kubernetes.io/projected/537f47a7-01d4-449a-8afc-a83a212f4bc5-kube-api-access-dbg2s\") pod \"537f47a7-01d4-449a-8afc-a83a212f4bc5\" (UID: \"537f47a7-01d4-449a-8afc-a83a212f4bc5\") " Mar 20 11:51:26 crc kubenswrapper[4860]: I0320 11:51:26.831697 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/537f47a7-01d4-449a-8afc-a83a212f4bc5-kube-api-access-dbg2s" (OuterVolumeSpecName: "kube-api-access-dbg2s") pod "537f47a7-01d4-449a-8afc-a83a212f4bc5" (UID: "537f47a7-01d4-449a-8afc-a83a212f4bc5"). InnerVolumeSpecName "kube-api-access-dbg2s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:51:26 crc kubenswrapper[4860]: I0320 11:51:26.925184 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbg2s\" (UniqueName: \"kubernetes.io/projected/537f47a7-01d4-449a-8afc-a83a212f4bc5-kube-api-access-dbg2s\") on node \"crc\" DevicePath \"\"" Mar 20 11:51:26 crc kubenswrapper[4860]: I0320 11:51:26.925197 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/537f47a7-01d4-449a-8afc-a83a212f4bc5-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "537f47a7-01d4-449a-8afc-a83a212f4bc5" (UID: "537f47a7-01d4-449a-8afc-a83a212f4bc5"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:51:27 crc kubenswrapper[4860]: I0320 11:51:27.026597 4860 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/537f47a7-01d4-449a-8afc-a83a212f4bc5-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 20 11:51:27 crc kubenswrapper[4860]: I0320 11:51:27.328932 4860 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-972zc_must-gather-g5zkg_537f47a7-01d4-449a-8afc-a83a212f4bc5/copy/0.log" Mar 20 11:51:27 crc kubenswrapper[4860]: I0320 11:51:27.331493 4860 generic.go:334] "Generic (PLEG): container finished" podID="537f47a7-01d4-449a-8afc-a83a212f4bc5" containerID="e02bf15081b9372bf7659815dbcb88502421613acc550737bba159008d91b9ba" exitCode=143 Mar 20 11:51:27 crc kubenswrapper[4860]: I0320 11:51:27.331639 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-972zc/must-gather-g5zkg" Mar 20 11:51:27 crc kubenswrapper[4860]: I0320 11:51:27.331577 4860 scope.go:117] "RemoveContainer" containerID="e02bf15081b9372bf7659815dbcb88502421613acc550737bba159008d91b9ba" Mar 20 11:51:27 crc kubenswrapper[4860]: I0320 11:51:27.356318 4860 scope.go:117] "RemoveContainer" containerID="fea1bff12d057f80685637551a80eabc97dbecd983b9b59ee632515897ce584b" Mar 20 11:51:27 crc kubenswrapper[4860]: I0320 11:51:27.420079 4860 scope.go:117] "RemoveContainer" containerID="e02bf15081b9372bf7659815dbcb88502421613acc550737bba159008d91b9ba" Mar 20 11:51:27 crc kubenswrapper[4860]: E0320 11:51:27.420451 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e02bf15081b9372bf7659815dbcb88502421613acc550737bba159008d91b9ba\": container with ID starting with e02bf15081b9372bf7659815dbcb88502421613acc550737bba159008d91b9ba not found: ID does not exist" containerID="e02bf15081b9372bf7659815dbcb88502421613acc550737bba159008d91b9ba" Mar 20 11:51:27 crc kubenswrapper[4860]: I0320 11:51:27.420479 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e02bf15081b9372bf7659815dbcb88502421613acc550737bba159008d91b9ba"} err="failed to get container status \"e02bf15081b9372bf7659815dbcb88502421613acc550737bba159008d91b9ba\": rpc error: code = NotFound desc = could not find container \"e02bf15081b9372bf7659815dbcb88502421613acc550737bba159008d91b9ba\": container with ID starting with e02bf15081b9372bf7659815dbcb88502421613acc550737bba159008d91b9ba not found: ID does not exist" Mar 20 11:51:27 crc kubenswrapper[4860]: I0320 11:51:27.420500 4860 scope.go:117] "RemoveContainer" containerID="fea1bff12d057f80685637551a80eabc97dbecd983b9b59ee632515897ce584b" Mar 20 11:51:27 crc kubenswrapper[4860]: I0320 11:51:27.423294 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="537f47a7-01d4-449a-8afc-a83a212f4bc5" path="/var/lib/kubelet/pods/537f47a7-01d4-449a-8afc-a83a212f4bc5/volumes" Mar 20 11:51:27 crc kubenswrapper[4860]: E0320 11:51:27.423341 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fea1bff12d057f80685637551a80eabc97dbecd983b9b59ee632515897ce584b\": container with ID starting with fea1bff12d057f80685637551a80eabc97dbecd983b9b59ee632515897ce584b not found: ID does not exist" containerID="fea1bff12d057f80685637551a80eabc97dbecd983b9b59ee632515897ce584b" Mar 20 11:51:27 crc kubenswrapper[4860]: I0320 11:51:27.423375 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fea1bff12d057f80685637551a80eabc97dbecd983b9b59ee632515897ce584b"} err="failed to get container status \"fea1bff12d057f80685637551a80eabc97dbecd983b9b59ee632515897ce584b\": rpc error: code = NotFound desc = could not find container \"fea1bff12d057f80685637551a80eabc97dbecd983b9b59ee632515897ce584b\": container with ID starting with fea1bff12d057f80685637551a80eabc97dbecd983b9b59ee632515897ce584b not found: ID does not exist" Mar 20 11:51:52 crc kubenswrapper[4860]: I0320 11:51:52.346375 4860 patch_prober.go:28] interesting pod/machine-config-daemon-kvdqp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:51:52 crc kubenswrapper[4860]: I0320 11:51:52.347340 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:51:57 crc kubenswrapper[4860]: I0320 11:51:57.053928 4860 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wrg5v"] Mar 20 11:51:57 crc kubenswrapper[4860]: E0320 11:51:57.055408 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1d1e9de-fef8-4113-b404-ee02a79e962c" containerName="oc" Mar 20 11:51:57 crc kubenswrapper[4860]: I0320 11:51:57.055429 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1d1e9de-fef8-4113-b404-ee02a79e962c" containerName="oc" Mar 20 11:51:57 crc kubenswrapper[4860]: E0320 11:51:57.055465 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="537f47a7-01d4-449a-8afc-a83a212f4bc5" containerName="gather" Mar 20 11:51:57 crc kubenswrapper[4860]: I0320 11:51:57.055474 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="537f47a7-01d4-449a-8afc-a83a212f4bc5" containerName="gather" Mar 20 11:51:57 crc kubenswrapper[4860]: E0320 11:51:57.055495 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="537f47a7-01d4-449a-8afc-a83a212f4bc5" containerName="copy" Mar 20 11:51:57 crc kubenswrapper[4860]: I0320 11:51:57.055502 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="537f47a7-01d4-449a-8afc-a83a212f4bc5" containerName="copy" Mar 20 11:51:57 crc kubenswrapper[4860]: I0320 11:51:57.055729 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="537f47a7-01d4-449a-8afc-a83a212f4bc5" containerName="copy" Mar 20 11:51:57 crc kubenswrapper[4860]: I0320 11:51:57.055746 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="537f47a7-01d4-449a-8afc-a83a212f4bc5" containerName="gather" Mar 20 11:51:57 crc kubenswrapper[4860]: I0320 11:51:57.055757 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1d1e9de-fef8-4113-b404-ee02a79e962c" containerName="oc" Mar 20 11:51:57 crc kubenswrapper[4860]: I0320 11:51:57.057136 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wrg5v" Mar 20 11:51:57 crc kubenswrapper[4860]: I0320 11:51:57.067689 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wrg5v"] Mar 20 11:51:57 crc kubenswrapper[4860]: I0320 11:51:57.169238 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60759547-601f-4452-b887-94820dba6b6c-utilities\") pod \"community-operators-wrg5v\" (UID: \"60759547-601f-4452-b887-94820dba6b6c\") " pod="openshift-marketplace/community-operators-wrg5v" Mar 20 11:51:57 crc kubenswrapper[4860]: I0320 11:51:57.169710 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60759547-601f-4452-b887-94820dba6b6c-catalog-content\") pod \"community-operators-wrg5v\" (UID: \"60759547-601f-4452-b887-94820dba6b6c\") " pod="openshift-marketplace/community-operators-wrg5v" Mar 20 11:51:57 crc kubenswrapper[4860]: I0320 11:51:57.169856 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmb74\" (UniqueName: \"kubernetes.io/projected/60759547-601f-4452-b887-94820dba6b6c-kube-api-access-vmb74\") pod \"community-operators-wrg5v\" (UID: \"60759547-601f-4452-b887-94820dba6b6c\") " pod="openshift-marketplace/community-operators-wrg5v" Mar 20 11:51:57 crc kubenswrapper[4860]: I0320 11:51:57.271551 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60759547-601f-4452-b887-94820dba6b6c-catalog-content\") pod \"community-operators-wrg5v\" (UID: \"60759547-601f-4452-b887-94820dba6b6c\") " pod="openshift-marketplace/community-operators-wrg5v" Mar 20 11:51:57 crc kubenswrapper[4860]: I0320 11:51:57.272008 4860 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-vmb74\" (UniqueName: \"kubernetes.io/projected/60759547-601f-4452-b887-94820dba6b6c-kube-api-access-vmb74\") pod \"community-operators-wrg5v\" (UID: \"60759547-601f-4452-b887-94820dba6b6c\") " pod="openshift-marketplace/community-operators-wrg5v" Mar 20 11:51:57 crc kubenswrapper[4860]: I0320 11:51:57.272290 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60759547-601f-4452-b887-94820dba6b6c-utilities\") pod \"community-operators-wrg5v\" (UID: \"60759547-601f-4452-b887-94820dba6b6c\") " pod="openshift-marketplace/community-operators-wrg5v" Mar 20 11:51:57 crc kubenswrapper[4860]: I0320 11:51:57.272450 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60759547-601f-4452-b887-94820dba6b6c-catalog-content\") pod \"community-operators-wrg5v\" (UID: \"60759547-601f-4452-b887-94820dba6b6c\") " pod="openshift-marketplace/community-operators-wrg5v" Mar 20 11:51:57 crc kubenswrapper[4860]: I0320 11:51:57.272900 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60759547-601f-4452-b887-94820dba6b6c-utilities\") pod \"community-operators-wrg5v\" (UID: \"60759547-601f-4452-b887-94820dba6b6c\") " pod="openshift-marketplace/community-operators-wrg5v" Mar 20 11:51:57 crc kubenswrapper[4860]: I0320 11:51:57.296126 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmb74\" (UniqueName: \"kubernetes.io/projected/60759547-601f-4452-b887-94820dba6b6c-kube-api-access-vmb74\") pod \"community-operators-wrg5v\" (UID: \"60759547-601f-4452-b887-94820dba6b6c\") " pod="openshift-marketplace/community-operators-wrg5v" Mar 20 11:51:57 crc kubenswrapper[4860]: I0320 11:51:57.398471 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wrg5v" Mar 20 11:51:57 crc kubenswrapper[4860]: I0320 11:51:57.918766 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wrg5v"] Mar 20 11:51:58 crc kubenswrapper[4860]: I0320 11:51:58.596018 4860 generic.go:334] "Generic (PLEG): container finished" podID="60759547-601f-4452-b887-94820dba6b6c" containerID="eec0d12b888c2750e2f84d2681310230377b0ba8dfbc189490e049f7b02b5b34" exitCode=0 Mar 20 11:51:58 crc kubenswrapper[4860]: I0320 11:51:58.596090 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wrg5v" event={"ID":"60759547-601f-4452-b887-94820dba6b6c","Type":"ContainerDied","Data":"eec0d12b888c2750e2f84d2681310230377b0ba8dfbc189490e049f7b02b5b34"} Mar 20 11:51:58 crc kubenswrapper[4860]: I0320 11:51:58.596647 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wrg5v" event={"ID":"60759547-601f-4452-b887-94820dba6b6c","Type":"ContainerStarted","Data":"76d0ff7a0d4f9572fcd6864eccc27b2ca9db2cb75e81302c9cd1d8a7f3b9b3fd"} Mar 20 11:51:58 crc kubenswrapper[4860]: I0320 11:51:58.599300 4860 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 11:51:59 crc kubenswrapper[4860]: I0320 11:51:59.608866 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wrg5v" event={"ID":"60759547-601f-4452-b887-94820dba6b6c","Type":"ContainerStarted","Data":"c93f9f1495b98e971e5c8db073e3b1aeb25deb749f36896b3b2f29df04e524df"} Mar 20 11:52:00 crc kubenswrapper[4860]: I0320 11:52:00.151737 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566792-ggljw"] Mar 20 11:52:00 crc kubenswrapper[4860]: I0320 11:52:00.152958 4860 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566792-ggljw" Mar 20 11:52:00 crc kubenswrapper[4860]: I0320 11:52:00.157706 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:52:00 crc kubenswrapper[4860]: I0320 11:52:00.157809 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:52:00 crc kubenswrapper[4860]: I0320 11:52:00.157978 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-82p2f" Mar 20 11:52:00 crc kubenswrapper[4860]: I0320 11:52:00.162474 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566792-ggljw"] Mar 20 11:52:00 crc kubenswrapper[4860]: I0320 11:52:00.324131 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f72mh\" (UniqueName: \"kubernetes.io/projected/2a33a045-cd5c-4295-8afd-92b36e24a572-kube-api-access-f72mh\") pod \"auto-csr-approver-29566792-ggljw\" (UID: \"2a33a045-cd5c-4295-8afd-92b36e24a572\") " pod="openshift-infra/auto-csr-approver-29566792-ggljw" Mar 20 11:52:00 crc kubenswrapper[4860]: I0320 11:52:00.425424 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f72mh\" (UniqueName: \"kubernetes.io/projected/2a33a045-cd5c-4295-8afd-92b36e24a572-kube-api-access-f72mh\") pod \"auto-csr-approver-29566792-ggljw\" (UID: \"2a33a045-cd5c-4295-8afd-92b36e24a572\") " pod="openshift-infra/auto-csr-approver-29566792-ggljw" Mar 20 11:52:00 crc kubenswrapper[4860]: I0320 11:52:00.450387 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f72mh\" (UniqueName: \"kubernetes.io/projected/2a33a045-cd5c-4295-8afd-92b36e24a572-kube-api-access-f72mh\") pod \"auto-csr-approver-29566792-ggljw\" (UID: \"2a33a045-cd5c-4295-8afd-92b36e24a572\") " 
pod="openshift-infra/auto-csr-approver-29566792-ggljw" Mar 20 11:52:00 crc kubenswrapper[4860]: I0320 11:52:00.520696 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566792-ggljw" Mar 20 11:52:00 crc kubenswrapper[4860]: I0320 11:52:00.648643 4860 generic.go:334] "Generic (PLEG): container finished" podID="60759547-601f-4452-b887-94820dba6b6c" containerID="c93f9f1495b98e971e5c8db073e3b1aeb25deb749f36896b3b2f29df04e524df" exitCode=0 Mar 20 11:52:00 crc kubenswrapper[4860]: I0320 11:52:00.648893 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wrg5v" event={"ID":"60759547-601f-4452-b887-94820dba6b6c","Type":"ContainerDied","Data":"c93f9f1495b98e971e5c8db073e3b1aeb25deb749f36896b3b2f29df04e524df"} Mar 20 11:52:00 crc kubenswrapper[4860]: I0320 11:52:00.976920 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566792-ggljw"] Mar 20 11:52:01 crc kubenswrapper[4860]: I0320 11:52:01.663671 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wrg5v" event={"ID":"60759547-601f-4452-b887-94820dba6b6c","Type":"ContainerStarted","Data":"22f0903b56f3bfa8dd880f9797a3c8fb10a4601857b23afae60154832f57e6bf"} Mar 20 11:52:01 crc kubenswrapper[4860]: I0320 11:52:01.665589 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566792-ggljw" event={"ID":"2a33a045-cd5c-4295-8afd-92b36e24a572","Type":"ContainerStarted","Data":"ef15e4d74cef2060c0738f0bd14f8359d3e53b0bfbfceae0b43ca70ef203bdf7"} Mar 20 11:52:01 crc kubenswrapper[4860]: I0320 11:52:01.705529 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wrg5v" podStartSLOduration=2.197807248 podStartE2EDuration="4.705491877s" podCreationTimestamp="2026-03-20 11:51:57 +0000 UTC" firstStartedPulling="2026-03-20 
11:51:58.598896533 +0000 UTC m=+3442.820257431" lastFinishedPulling="2026-03-20 11:52:01.106581162 +0000 UTC m=+3445.327942060" observedRunningTime="2026-03-20 11:52:01.695069526 +0000 UTC m=+3445.916430414" watchObservedRunningTime="2026-03-20 11:52:01.705491877 +0000 UTC m=+3445.926852775" Mar 20 11:52:02 crc kubenswrapper[4860]: I0320 11:52:02.675010 4860 generic.go:334] "Generic (PLEG): container finished" podID="2a33a045-cd5c-4295-8afd-92b36e24a572" containerID="8536cbf5e2641eae0dadbfbbe6f7cd6043ca90153b3252b63e0cfc33da21beb0" exitCode=0 Mar 20 11:52:02 crc kubenswrapper[4860]: I0320 11:52:02.675076 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566792-ggljw" event={"ID":"2a33a045-cd5c-4295-8afd-92b36e24a572","Type":"ContainerDied","Data":"8536cbf5e2641eae0dadbfbbe6f7cd6043ca90153b3252b63e0cfc33da21beb0"} Mar 20 11:52:04 crc kubenswrapper[4860]: I0320 11:52:04.036006 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566792-ggljw" Mar 20 11:52:04 crc kubenswrapper[4860]: I0320 11:52:04.185996 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f72mh\" (UniqueName: \"kubernetes.io/projected/2a33a045-cd5c-4295-8afd-92b36e24a572-kube-api-access-f72mh\") pod \"2a33a045-cd5c-4295-8afd-92b36e24a572\" (UID: \"2a33a045-cd5c-4295-8afd-92b36e24a572\") " Mar 20 11:52:04 crc kubenswrapper[4860]: I0320 11:52:04.200476 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a33a045-cd5c-4295-8afd-92b36e24a572-kube-api-access-f72mh" (OuterVolumeSpecName: "kube-api-access-f72mh") pod "2a33a045-cd5c-4295-8afd-92b36e24a572" (UID: "2a33a045-cd5c-4295-8afd-92b36e24a572"). InnerVolumeSpecName "kube-api-access-f72mh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:52:04 crc kubenswrapper[4860]: I0320 11:52:04.287926 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f72mh\" (UniqueName: \"kubernetes.io/projected/2a33a045-cd5c-4295-8afd-92b36e24a572-kube-api-access-f72mh\") on node \"crc\" DevicePath \"\"" Mar 20 11:52:04 crc kubenswrapper[4860]: I0320 11:52:04.692372 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566792-ggljw" event={"ID":"2a33a045-cd5c-4295-8afd-92b36e24a572","Type":"ContainerDied","Data":"ef15e4d74cef2060c0738f0bd14f8359d3e53b0bfbfceae0b43ca70ef203bdf7"} Mar 20 11:52:04 crc kubenswrapper[4860]: I0320 11:52:04.692769 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef15e4d74cef2060c0738f0bd14f8359d3e53b0bfbfceae0b43ca70ef203bdf7" Mar 20 11:52:04 crc kubenswrapper[4860]: I0320 11:52:04.692451 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566792-ggljw" Mar 20 11:52:05 crc kubenswrapper[4860]: I0320 11:52:05.127510 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566786-6kgpn"] Mar 20 11:52:05 crc kubenswrapper[4860]: I0320 11:52:05.134312 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566786-6kgpn"] Mar 20 11:52:05 crc kubenswrapper[4860]: I0320 11:52:05.425086 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a69dccb3-324d-47b3-92d0-af9fc224932d" path="/var/lib/kubelet/pods/a69dccb3-324d-47b3-92d0-af9fc224932d/volumes" Mar 20 11:52:07 crc kubenswrapper[4860]: I0320 11:52:07.398981 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wrg5v" Mar 20 11:52:07 crc kubenswrapper[4860]: I0320 11:52:07.399483 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/community-operators-wrg5v" Mar 20 11:52:07 crc kubenswrapper[4860]: I0320 11:52:07.442811 4860 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wrg5v" Mar 20 11:52:07 crc kubenswrapper[4860]: I0320 11:52:07.764533 4860 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wrg5v" Mar 20 11:52:07 crc kubenswrapper[4860]: I0320 11:52:07.824931 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wrg5v"] Mar 20 11:52:09 crc kubenswrapper[4860]: I0320 11:52:09.728141 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wrg5v" podUID="60759547-601f-4452-b887-94820dba6b6c" containerName="registry-server" containerID="cri-o://22f0903b56f3bfa8dd880f9797a3c8fb10a4601857b23afae60154832f57e6bf" gracePeriod=2 Mar 20 11:52:10 crc kubenswrapper[4860]: I0320 11:52:10.204737 4860 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wrg5v" Mar 20 11:52:10 crc kubenswrapper[4860]: I0320 11:52:10.388408 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmb74\" (UniqueName: \"kubernetes.io/projected/60759547-601f-4452-b887-94820dba6b6c-kube-api-access-vmb74\") pod \"60759547-601f-4452-b887-94820dba6b6c\" (UID: \"60759547-601f-4452-b887-94820dba6b6c\") " Mar 20 11:52:10 crc kubenswrapper[4860]: I0320 11:52:10.388569 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60759547-601f-4452-b887-94820dba6b6c-catalog-content\") pod \"60759547-601f-4452-b887-94820dba6b6c\" (UID: \"60759547-601f-4452-b887-94820dba6b6c\") " Mar 20 11:52:10 crc kubenswrapper[4860]: I0320 11:52:10.388611 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60759547-601f-4452-b887-94820dba6b6c-utilities\") pod \"60759547-601f-4452-b887-94820dba6b6c\" (UID: \"60759547-601f-4452-b887-94820dba6b6c\") " Mar 20 11:52:10 crc kubenswrapper[4860]: I0320 11:52:10.389911 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60759547-601f-4452-b887-94820dba6b6c-utilities" (OuterVolumeSpecName: "utilities") pod "60759547-601f-4452-b887-94820dba6b6c" (UID: "60759547-601f-4452-b887-94820dba6b6c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:52:10 crc kubenswrapper[4860]: I0320 11:52:10.398448 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60759547-601f-4452-b887-94820dba6b6c-kube-api-access-vmb74" (OuterVolumeSpecName: "kube-api-access-vmb74") pod "60759547-601f-4452-b887-94820dba6b6c" (UID: "60759547-601f-4452-b887-94820dba6b6c"). InnerVolumeSpecName "kube-api-access-vmb74". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:52:10 crc kubenswrapper[4860]: I0320 11:52:10.446179 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60759547-601f-4452-b887-94820dba6b6c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "60759547-601f-4452-b887-94820dba6b6c" (UID: "60759547-601f-4452-b887-94820dba6b6c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:52:10 crc kubenswrapper[4860]: I0320 11:52:10.490746 4860 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60759547-601f-4452-b887-94820dba6b6c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:52:10 crc kubenswrapper[4860]: I0320 11:52:10.490830 4860 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60759547-601f-4452-b887-94820dba6b6c-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:52:10 crc kubenswrapper[4860]: I0320 11:52:10.490844 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmb74\" (UniqueName: \"kubernetes.io/projected/60759547-601f-4452-b887-94820dba6b6c-kube-api-access-vmb74\") on node \"crc\" DevicePath \"\"" Mar 20 11:52:10 crc kubenswrapper[4860]: I0320 11:52:10.747048 4860 generic.go:334] "Generic (PLEG): container finished" podID="60759547-601f-4452-b887-94820dba6b6c" containerID="22f0903b56f3bfa8dd880f9797a3c8fb10a4601857b23afae60154832f57e6bf" exitCode=0 Mar 20 11:52:10 crc kubenswrapper[4860]: I0320 11:52:10.747102 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wrg5v" event={"ID":"60759547-601f-4452-b887-94820dba6b6c","Type":"ContainerDied","Data":"22f0903b56f3bfa8dd880f9797a3c8fb10a4601857b23afae60154832f57e6bf"} Mar 20 11:52:10 crc kubenswrapper[4860]: I0320 11:52:10.747138 4860 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-wrg5v" event={"ID":"60759547-601f-4452-b887-94820dba6b6c","Type":"ContainerDied","Data":"76d0ff7a0d4f9572fcd6864eccc27b2ca9db2cb75e81302c9cd1d8a7f3b9b3fd"} Mar 20 11:52:10 crc kubenswrapper[4860]: I0320 11:52:10.747157 4860 scope.go:117] "RemoveContainer" containerID="22f0903b56f3bfa8dd880f9797a3c8fb10a4601857b23afae60154832f57e6bf" Mar 20 11:52:10 crc kubenswrapper[4860]: I0320 11:52:10.747338 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wrg5v" Mar 20 11:52:10 crc kubenswrapper[4860]: I0320 11:52:10.771596 4860 scope.go:117] "RemoveContainer" containerID="c93f9f1495b98e971e5c8db073e3b1aeb25deb749f36896b3b2f29df04e524df" Mar 20 11:52:10 crc kubenswrapper[4860]: I0320 11:52:10.786651 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wrg5v"] Mar 20 11:52:10 crc kubenswrapper[4860]: I0320 11:52:10.793719 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wrg5v"] Mar 20 11:52:10 crc kubenswrapper[4860]: I0320 11:52:10.812460 4860 scope.go:117] "RemoveContainer" containerID="eec0d12b888c2750e2f84d2681310230377b0ba8dfbc189490e049f7b02b5b34" Mar 20 11:52:10 crc kubenswrapper[4860]: I0320 11:52:10.842934 4860 scope.go:117] "RemoveContainer" containerID="22f0903b56f3bfa8dd880f9797a3c8fb10a4601857b23afae60154832f57e6bf" Mar 20 11:52:10 crc kubenswrapper[4860]: E0320 11:52:10.843718 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22f0903b56f3bfa8dd880f9797a3c8fb10a4601857b23afae60154832f57e6bf\": container with ID starting with 22f0903b56f3bfa8dd880f9797a3c8fb10a4601857b23afae60154832f57e6bf not found: ID does not exist" containerID="22f0903b56f3bfa8dd880f9797a3c8fb10a4601857b23afae60154832f57e6bf" Mar 20 11:52:10 crc kubenswrapper[4860]: I0320 
11:52:10.843809 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22f0903b56f3bfa8dd880f9797a3c8fb10a4601857b23afae60154832f57e6bf"} err="failed to get container status \"22f0903b56f3bfa8dd880f9797a3c8fb10a4601857b23afae60154832f57e6bf\": rpc error: code = NotFound desc = could not find container \"22f0903b56f3bfa8dd880f9797a3c8fb10a4601857b23afae60154832f57e6bf\": container with ID starting with 22f0903b56f3bfa8dd880f9797a3c8fb10a4601857b23afae60154832f57e6bf not found: ID does not exist" Mar 20 11:52:10 crc kubenswrapper[4860]: I0320 11:52:10.843863 4860 scope.go:117] "RemoveContainer" containerID="c93f9f1495b98e971e5c8db073e3b1aeb25deb749f36896b3b2f29df04e524df" Mar 20 11:52:10 crc kubenswrapper[4860]: E0320 11:52:10.844614 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c93f9f1495b98e971e5c8db073e3b1aeb25deb749f36896b3b2f29df04e524df\": container with ID starting with c93f9f1495b98e971e5c8db073e3b1aeb25deb749f36896b3b2f29df04e524df not found: ID does not exist" containerID="c93f9f1495b98e971e5c8db073e3b1aeb25deb749f36896b3b2f29df04e524df" Mar 20 11:52:10 crc kubenswrapper[4860]: I0320 11:52:10.844648 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c93f9f1495b98e971e5c8db073e3b1aeb25deb749f36896b3b2f29df04e524df"} err="failed to get container status \"c93f9f1495b98e971e5c8db073e3b1aeb25deb749f36896b3b2f29df04e524df\": rpc error: code = NotFound desc = could not find container \"c93f9f1495b98e971e5c8db073e3b1aeb25deb749f36896b3b2f29df04e524df\": container with ID starting with c93f9f1495b98e971e5c8db073e3b1aeb25deb749f36896b3b2f29df04e524df not found: ID does not exist" Mar 20 11:52:10 crc kubenswrapper[4860]: I0320 11:52:10.844665 4860 scope.go:117] "RemoveContainer" containerID="eec0d12b888c2750e2f84d2681310230377b0ba8dfbc189490e049f7b02b5b34" Mar 20 11:52:10 crc 
kubenswrapper[4860]: E0320 11:52:10.845161 4860 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eec0d12b888c2750e2f84d2681310230377b0ba8dfbc189490e049f7b02b5b34\": container with ID starting with eec0d12b888c2750e2f84d2681310230377b0ba8dfbc189490e049f7b02b5b34 not found: ID does not exist" containerID="eec0d12b888c2750e2f84d2681310230377b0ba8dfbc189490e049f7b02b5b34"
Mar 20 11:52:10 crc kubenswrapper[4860]: I0320 11:52:10.845209 4860 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eec0d12b888c2750e2f84d2681310230377b0ba8dfbc189490e049f7b02b5b34"} err="failed to get container status \"eec0d12b888c2750e2f84d2681310230377b0ba8dfbc189490e049f7b02b5b34\": rpc error: code = NotFound desc = could not find container \"eec0d12b888c2750e2f84d2681310230377b0ba8dfbc189490e049f7b02b5b34\": container with ID starting with eec0d12b888c2750e2f84d2681310230377b0ba8dfbc189490e049f7b02b5b34 not found: ID does not exist"
Mar 20 11:52:11 crc kubenswrapper[4860]: I0320 11:52:11.435367 4860 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60759547-601f-4452-b887-94820dba6b6c" path="/var/lib/kubelet/pods/60759547-601f-4452-b887-94820dba6b6c/volumes"
Mar 20 11:52:22 crc kubenswrapper[4860]: I0320 11:52:22.344363 4860 patch_prober.go:28] interesting pod/machine-config-daemon-kvdqp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 11:52:22 crc kubenswrapper[4860]: I0320 11:52:22.345105 4860 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 11:52:22 crc kubenswrapper[4860]: I0320 11:52:22.345166 4860 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp"
Mar 20 11:52:22 crc kubenswrapper[4860]: I0320 11:52:22.346062 4860 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"27de1546d8fc12cd142537f3e435648e650a77886fa9d1677aedbb1fcd90c4bd"} pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 11:52:22 crc kubenswrapper[4860]: I0320 11:52:22.346142 4860 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" podUID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerName="machine-config-daemon" containerID="cri-o://27de1546d8fc12cd142537f3e435648e650a77886fa9d1677aedbb1fcd90c4bd" gracePeriod=600
Mar 20 11:52:22 crc kubenswrapper[4860]: I0320 11:52:22.866450 4860 generic.go:334] "Generic (PLEG): container finished" podID="6a9df230-75a1-4b64-8d00-c179e9c19080" containerID="27de1546d8fc12cd142537f3e435648e650a77886fa9d1677aedbb1fcd90c4bd" exitCode=0
Mar 20 11:52:22 crc kubenswrapper[4860]: I0320 11:52:22.866533 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" event={"ID":"6a9df230-75a1-4b64-8d00-c179e9c19080","Type":"ContainerDied","Data":"27de1546d8fc12cd142537f3e435648e650a77886fa9d1677aedbb1fcd90c4bd"}
Mar 20 11:52:22 crc kubenswrapper[4860]: I0320 11:52:22.866953 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdqp" event={"ID":"6a9df230-75a1-4b64-8d00-c179e9c19080","Type":"ContainerStarted","Data":"c3a6e4b824ce80b190435234a888f23469947223fd7d2c0597395741b3d52f34"}
Mar 20 11:52:22 crc kubenswrapper[4860]: I0320 11:52:22.866986 4860 scope.go:117] "RemoveContainer" containerID="2c6cd87f8a77136a1839b5f761bdf9d0b06d37fb07e23d5035e6ffed2dcf296d"
Mar 20 11:52:49 crc kubenswrapper[4860]: I0320 11:52:49.840847 4860 scope.go:117] "RemoveContainer" containerID="02ca8def8758e2ad1b605230bcb844ea2d285141ff0c9b3e5a91bad1e50bf67e"
Mar 20 11:54:00 crc kubenswrapper[4860]: I0320 11:54:00.165925 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566794-qpsdl"]
Mar 20 11:54:00 crc kubenswrapper[4860]: E0320 11:54:00.166902 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60759547-601f-4452-b887-94820dba6b6c" containerName="extract-utilities"
Mar 20 11:54:00 crc kubenswrapper[4860]: I0320 11:54:00.166917 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="60759547-601f-4452-b887-94820dba6b6c" containerName="extract-utilities"
Mar 20 11:54:00 crc kubenswrapper[4860]: E0320 11:54:00.166926 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a33a045-cd5c-4295-8afd-92b36e24a572" containerName="oc"
Mar 20 11:54:00 crc kubenswrapper[4860]: I0320 11:54:00.166933 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a33a045-cd5c-4295-8afd-92b36e24a572" containerName="oc"
Mar 20 11:54:00 crc kubenswrapper[4860]: E0320 11:54:00.166953 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60759547-601f-4452-b887-94820dba6b6c" containerName="extract-content"
Mar 20 11:54:00 crc kubenswrapper[4860]: I0320 11:54:00.166961 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="60759547-601f-4452-b887-94820dba6b6c" containerName="extract-content"
Mar 20 11:54:00 crc kubenswrapper[4860]: E0320 11:54:00.166987 4860 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60759547-601f-4452-b887-94820dba6b6c" containerName="registry-server"
Mar 20 11:54:00 crc kubenswrapper[4860]: I0320 11:54:00.166995 4860 state_mem.go:107] "Deleted CPUSet assignment" podUID="60759547-601f-4452-b887-94820dba6b6c" containerName="registry-server"
Mar 20 11:54:00 crc kubenswrapper[4860]: I0320 11:54:00.167157 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a33a045-cd5c-4295-8afd-92b36e24a572" containerName="oc"
Mar 20 11:54:00 crc kubenswrapper[4860]: I0320 11:54:00.167172 4860 memory_manager.go:354] "RemoveStaleState removing state" podUID="60759547-601f-4452-b887-94820dba6b6c" containerName="registry-server"
Mar 20 11:54:00 crc kubenswrapper[4860]: I0320 11:54:00.167666 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566794-qpsdl"
Mar 20 11:54:00 crc kubenswrapper[4860]: I0320 11:54:00.177775 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566794-qpsdl"]
Mar 20 11:54:00 crc kubenswrapper[4860]: I0320 11:54:00.179048 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 11:54:00 crc kubenswrapper[4860]: I0320 11:54:00.180715 4860 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-82p2f"
Mar 20 11:54:00 crc kubenswrapper[4860]: I0320 11:54:00.187056 4860 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 11:54:00 crc kubenswrapper[4860]: I0320 11:54:00.277037 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btg9p\" (UniqueName: \"kubernetes.io/projected/1babb09d-4647-48e8-bb5a-ea00aa1e0a89-kube-api-access-btg9p\") pod \"auto-csr-approver-29566794-qpsdl\" (UID: \"1babb09d-4647-48e8-bb5a-ea00aa1e0a89\") " pod="openshift-infra/auto-csr-approver-29566794-qpsdl"
Mar 20 11:54:00 crc kubenswrapper[4860]: I0320 11:54:00.378983 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btg9p\" (UniqueName: \"kubernetes.io/projected/1babb09d-4647-48e8-bb5a-ea00aa1e0a89-kube-api-access-btg9p\") pod \"auto-csr-approver-29566794-qpsdl\" (UID: \"1babb09d-4647-48e8-bb5a-ea00aa1e0a89\") " pod="openshift-infra/auto-csr-approver-29566794-qpsdl"
Mar 20 11:54:00 crc kubenswrapper[4860]: I0320 11:54:00.406972 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btg9p\" (UniqueName: \"kubernetes.io/projected/1babb09d-4647-48e8-bb5a-ea00aa1e0a89-kube-api-access-btg9p\") pod \"auto-csr-approver-29566794-qpsdl\" (UID: \"1babb09d-4647-48e8-bb5a-ea00aa1e0a89\") " pod="openshift-infra/auto-csr-approver-29566794-qpsdl"
Mar 20 11:54:00 crc kubenswrapper[4860]: I0320 11:54:00.421488 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6pxcc"]
Mar 20 11:54:00 crc kubenswrapper[4860]: I0320 11:54:00.425015 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6pxcc"
Mar 20 11:54:00 crc kubenswrapper[4860]: I0320 11:54:00.442862 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6pxcc"]
Mar 20 11:54:00 crc kubenswrapper[4860]: I0320 11:54:00.481517 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68089545-e05b-4352-b47d-37ad7ae7bd55-catalog-content\") pod \"redhat-marketplace-6pxcc\" (UID: \"68089545-e05b-4352-b47d-37ad7ae7bd55\") " pod="openshift-marketplace/redhat-marketplace-6pxcc"
Mar 20 11:54:00 crc kubenswrapper[4860]: I0320 11:54:00.481601 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhtm5\" (UniqueName: \"kubernetes.io/projected/68089545-e05b-4352-b47d-37ad7ae7bd55-kube-api-access-hhtm5\") pod \"redhat-marketplace-6pxcc\" (UID: \"68089545-e05b-4352-b47d-37ad7ae7bd55\") " pod="openshift-marketplace/redhat-marketplace-6pxcc"
Mar 20 11:54:00 crc kubenswrapper[4860]: I0320 11:54:00.481637 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68089545-e05b-4352-b47d-37ad7ae7bd55-utilities\") pod \"redhat-marketplace-6pxcc\" (UID: \"68089545-e05b-4352-b47d-37ad7ae7bd55\") " pod="openshift-marketplace/redhat-marketplace-6pxcc"
Mar 20 11:54:00 crc kubenswrapper[4860]: I0320 11:54:00.492596 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566794-qpsdl"
Mar 20 11:54:00 crc kubenswrapper[4860]: I0320 11:54:00.582946 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68089545-e05b-4352-b47d-37ad7ae7bd55-catalog-content\") pod \"redhat-marketplace-6pxcc\" (UID: \"68089545-e05b-4352-b47d-37ad7ae7bd55\") " pod="openshift-marketplace/redhat-marketplace-6pxcc"
Mar 20 11:54:00 crc kubenswrapper[4860]: I0320 11:54:00.583033 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhtm5\" (UniqueName: \"kubernetes.io/projected/68089545-e05b-4352-b47d-37ad7ae7bd55-kube-api-access-hhtm5\") pod \"redhat-marketplace-6pxcc\" (UID: \"68089545-e05b-4352-b47d-37ad7ae7bd55\") " pod="openshift-marketplace/redhat-marketplace-6pxcc"
Mar 20 11:54:00 crc kubenswrapper[4860]: I0320 11:54:00.583080 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68089545-e05b-4352-b47d-37ad7ae7bd55-utilities\") pod \"redhat-marketplace-6pxcc\" (UID: \"68089545-e05b-4352-b47d-37ad7ae7bd55\") " pod="openshift-marketplace/redhat-marketplace-6pxcc"
Mar 20 11:54:00 crc kubenswrapper[4860]: I0320 11:54:00.583824 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68089545-e05b-4352-b47d-37ad7ae7bd55-utilities\") pod \"redhat-marketplace-6pxcc\" (UID: \"68089545-e05b-4352-b47d-37ad7ae7bd55\") " pod="openshift-marketplace/redhat-marketplace-6pxcc"
Mar 20 11:54:00 crc kubenswrapper[4860]: I0320 11:54:00.583927 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68089545-e05b-4352-b47d-37ad7ae7bd55-catalog-content\") pod \"redhat-marketplace-6pxcc\" (UID: \"68089545-e05b-4352-b47d-37ad7ae7bd55\") " pod="openshift-marketplace/redhat-marketplace-6pxcc"
Mar 20 11:54:00 crc kubenswrapper[4860]: I0320 11:54:00.616435 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhtm5\" (UniqueName: \"kubernetes.io/projected/68089545-e05b-4352-b47d-37ad7ae7bd55-kube-api-access-hhtm5\") pod \"redhat-marketplace-6pxcc\" (UID: \"68089545-e05b-4352-b47d-37ad7ae7bd55\") " pod="openshift-marketplace/redhat-marketplace-6pxcc"
Mar 20 11:54:00 crc kubenswrapper[4860]: I0320 11:54:00.762263 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6pxcc"
Mar 20 11:54:01 crc kubenswrapper[4860]: I0320 11:54:01.025543 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566794-qpsdl"]
Mar 20 11:54:01 crc kubenswrapper[4860]: I0320 11:54:01.096908 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6pxcc"]
Mar 20 11:54:01 crc kubenswrapper[4860]: W0320 11:54:01.107744 4860 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68089545_e05b_4352_b47d_37ad7ae7bd55.slice/crio-49d8678e90a394cc649a9253515242549bd94aa6fb2157d46a392d81d8ef553e WatchSource:0}: Error finding container 49d8678e90a394cc649a9253515242549bd94aa6fb2157d46a392d81d8ef553e: Status 404 returned error can't find the container with id 49d8678e90a394cc649a9253515242549bd94aa6fb2157d46a392d81d8ef553e
Mar 20 11:54:01 crc kubenswrapper[4860]: I0320 11:54:01.880185 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566794-qpsdl" event={"ID":"1babb09d-4647-48e8-bb5a-ea00aa1e0a89","Type":"ContainerStarted","Data":"75db89442dea1a5e7a6fab371ccd07d57c6d23c7253c6c905c2a63c9c5f30a92"}
Mar 20 11:54:01 crc kubenswrapper[4860]: I0320 11:54:01.884343 4860 generic.go:334] "Generic (PLEG): container finished" podID="68089545-e05b-4352-b47d-37ad7ae7bd55" containerID="b3bbe945735b107b64688b96a18e24f6c51fd781e6eccc610b43833b493bb15f" exitCode=0
Mar 20 11:54:01 crc kubenswrapper[4860]: I0320 11:54:01.884418 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6pxcc" event={"ID":"68089545-e05b-4352-b47d-37ad7ae7bd55","Type":"ContainerDied","Data":"b3bbe945735b107b64688b96a18e24f6c51fd781e6eccc610b43833b493bb15f"}
Mar 20 11:54:01 crc kubenswrapper[4860]: I0320 11:54:01.884462 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6pxcc" event={"ID":"68089545-e05b-4352-b47d-37ad7ae7bd55","Type":"ContainerStarted","Data":"49d8678e90a394cc649a9253515242549bd94aa6fb2157d46a392d81d8ef553e"}
Mar 20 11:54:02 crc kubenswrapper[4860]: I0320 11:54:02.816180 4860 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fdtds"]
Mar 20 11:54:02 crc kubenswrapper[4860]: I0320 11:54:02.818370 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fdtds"
Mar 20 11:54:02 crc kubenswrapper[4860]: I0320 11:54:02.836288 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fdtds"]
Mar 20 11:54:02 crc kubenswrapper[4860]: I0320 11:54:02.908492 4860 generic.go:334] "Generic (PLEG): container finished" podID="1babb09d-4647-48e8-bb5a-ea00aa1e0a89" containerID="027bfeeca9ef5489ad1ac6e60f7135ea097246b42bf09aa60991a6ad1192d512" exitCode=0
Mar 20 11:54:02 crc kubenswrapper[4860]: I0320 11:54:02.908624 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566794-qpsdl" event={"ID":"1babb09d-4647-48e8-bb5a-ea00aa1e0a89","Type":"ContainerDied","Data":"027bfeeca9ef5489ad1ac6e60f7135ea097246b42bf09aa60991a6ad1192d512"}
Mar 20 11:54:02 crc kubenswrapper[4860]: I0320 11:54:02.913755 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6pxcc" event={"ID":"68089545-e05b-4352-b47d-37ad7ae7bd55","Type":"ContainerStarted","Data":"484dcfcf91015530ae85e0ef6e791b5847b29e3a4667cc1d3c55be76a269f483"}
Mar 20 11:54:02 crc kubenswrapper[4860]: I0320 11:54:02.926707 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/704d95d6-aca3-4174-b2ac-985b2bfbeb5d-utilities\") pod \"redhat-operators-fdtds\" (UID: \"704d95d6-aca3-4174-b2ac-985b2bfbeb5d\") " pod="openshift-marketplace/redhat-operators-fdtds"
Mar 20 11:54:02 crc kubenswrapper[4860]: I0320 11:54:02.927108 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q76sc\" (UniqueName: \"kubernetes.io/projected/704d95d6-aca3-4174-b2ac-985b2bfbeb5d-kube-api-access-q76sc\") pod \"redhat-operators-fdtds\" (UID: \"704d95d6-aca3-4174-b2ac-985b2bfbeb5d\") " pod="openshift-marketplace/redhat-operators-fdtds"
Mar 20 11:54:02 crc kubenswrapper[4860]: I0320 11:54:02.927286 4860 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/704d95d6-aca3-4174-b2ac-985b2bfbeb5d-catalog-content\") pod \"redhat-operators-fdtds\" (UID: \"704d95d6-aca3-4174-b2ac-985b2bfbeb5d\") " pod="openshift-marketplace/redhat-operators-fdtds"
Mar 20 11:54:03 crc kubenswrapper[4860]: I0320 11:54:03.030444 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/704d95d6-aca3-4174-b2ac-985b2bfbeb5d-utilities\") pod \"redhat-operators-fdtds\" (UID: \"704d95d6-aca3-4174-b2ac-985b2bfbeb5d\") " pod="openshift-marketplace/redhat-operators-fdtds"
Mar 20 11:54:03 crc kubenswrapper[4860]: I0320 11:54:03.030549 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q76sc\" (UniqueName: \"kubernetes.io/projected/704d95d6-aca3-4174-b2ac-985b2bfbeb5d-kube-api-access-q76sc\") pod \"redhat-operators-fdtds\" (UID: \"704d95d6-aca3-4174-b2ac-985b2bfbeb5d\") " pod="openshift-marketplace/redhat-operators-fdtds"
Mar 20 11:54:03 crc kubenswrapper[4860]: I0320 11:54:03.030585 4860 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/704d95d6-aca3-4174-b2ac-985b2bfbeb5d-catalog-content\") pod \"redhat-operators-fdtds\" (UID: \"704d95d6-aca3-4174-b2ac-985b2bfbeb5d\") " pod="openshift-marketplace/redhat-operators-fdtds"
Mar 20 11:54:03 crc kubenswrapper[4860]: I0320 11:54:03.031428 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/704d95d6-aca3-4174-b2ac-985b2bfbeb5d-catalog-content\") pod \"redhat-operators-fdtds\" (UID: \"704d95d6-aca3-4174-b2ac-985b2bfbeb5d\") " pod="openshift-marketplace/redhat-operators-fdtds"
Mar 20 11:54:03 crc kubenswrapper[4860]: I0320 11:54:03.031931 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/704d95d6-aca3-4174-b2ac-985b2bfbeb5d-utilities\") pod \"redhat-operators-fdtds\" (UID: \"704d95d6-aca3-4174-b2ac-985b2bfbeb5d\") " pod="openshift-marketplace/redhat-operators-fdtds"
Mar 20 11:54:03 crc kubenswrapper[4860]: I0320 11:54:03.066671 4860 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q76sc\" (UniqueName: \"kubernetes.io/projected/704d95d6-aca3-4174-b2ac-985b2bfbeb5d-kube-api-access-q76sc\") pod \"redhat-operators-fdtds\" (UID: \"704d95d6-aca3-4174-b2ac-985b2bfbeb5d\") " pod="openshift-marketplace/redhat-operators-fdtds"
Mar 20 11:54:03 crc kubenswrapper[4860]: I0320 11:54:03.137157 4860 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fdtds"
Mar 20 11:54:03 crc kubenswrapper[4860]: I0320 11:54:03.728520 4860 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fdtds"]
Mar 20 11:54:03 crc kubenswrapper[4860]: I0320 11:54:03.925454 4860 generic.go:334] "Generic (PLEG): container finished" podID="68089545-e05b-4352-b47d-37ad7ae7bd55" containerID="484dcfcf91015530ae85e0ef6e791b5847b29e3a4667cc1d3c55be76a269f483" exitCode=0
Mar 20 11:54:03 crc kubenswrapper[4860]: I0320 11:54:03.925551 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6pxcc" event={"ID":"68089545-e05b-4352-b47d-37ad7ae7bd55","Type":"ContainerDied","Data":"484dcfcf91015530ae85e0ef6e791b5847b29e3a4667cc1d3c55be76a269f483"}
Mar 20 11:54:03 crc kubenswrapper[4860]: I0320 11:54:03.929180 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fdtds" event={"ID":"704d95d6-aca3-4174-b2ac-985b2bfbeb5d","Type":"ContainerStarted","Data":"330d366d0bd98890897cef9f455199916baf9480a0d51d16ee738892098bbac5"}
Mar 20 11:54:03 crc kubenswrapper[4860]: I0320 11:54:03.929249 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fdtds" event={"ID":"704d95d6-aca3-4174-b2ac-985b2bfbeb5d","Type":"ContainerStarted","Data":"a4f9dcd43db44e4ce7dee893eb45c95c802e80f34966961a8a158af1d1e14255"}
Mar 20 11:54:04 crc kubenswrapper[4860]: I0320 11:54:04.314690 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566794-qpsdl"
Mar 20 11:54:04 crc kubenswrapper[4860]: I0320 11:54:04.481523 4860 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btg9p\" (UniqueName: \"kubernetes.io/projected/1babb09d-4647-48e8-bb5a-ea00aa1e0a89-kube-api-access-btg9p\") pod \"1babb09d-4647-48e8-bb5a-ea00aa1e0a89\" (UID: \"1babb09d-4647-48e8-bb5a-ea00aa1e0a89\") "
Mar 20 11:54:04 crc kubenswrapper[4860]: I0320 11:54:04.489476 4860 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1babb09d-4647-48e8-bb5a-ea00aa1e0a89-kube-api-access-btg9p" (OuterVolumeSpecName: "kube-api-access-btg9p") pod "1babb09d-4647-48e8-bb5a-ea00aa1e0a89" (UID: "1babb09d-4647-48e8-bb5a-ea00aa1e0a89"). InnerVolumeSpecName "kube-api-access-btg9p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 11:54:04 crc kubenswrapper[4860]: I0320 11:54:04.583452 4860 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-btg9p\" (UniqueName: \"kubernetes.io/projected/1babb09d-4647-48e8-bb5a-ea00aa1e0a89-kube-api-access-btg9p\") on node \"crc\" DevicePath \"\""
Mar 20 11:54:04 crc kubenswrapper[4860]: I0320 11:54:04.948061 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566794-qpsdl" event={"ID":"1babb09d-4647-48e8-bb5a-ea00aa1e0a89","Type":"ContainerDied","Data":"75db89442dea1a5e7a6fab371ccd07d57c6d23c7253c6c905c2a63c9c5f30a92"}
Mar 20 11:54:04 crc kubenswrapper[4860]: I0320 11:54:04.948630 4860 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75db89442dea1a5e7a6fab371ccd07d57c6d23c7253c6c905c2a63c9c5f30a92"
Mar 20 11:54:04 crc kubenswrapper[4860]: I0320 11:54:04.948731 4860 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566794-qpsdl"
Mar 20 11:54:04 crc kubenswrapper[4860]: I0320 11:54:04.959054 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6pxcc" event={"ID":"68089545-e05b-4352-b47d-37ad7ae7bd55","Type":"ContainerStarted","Data":"884fde30a6c7af9672f286b0c571a0af93825abe1de41d97152f174ee8dc4ad0"}
Mar 20 11:54:04 crc kubenswrapper[4860]: I0320 11:54:04.962018 4860 generic.go:334] "Generic (PLEG): container finished" podID="704d95d6-aca3-4174-b2ac-985b2bfbeb5d" containerID="330d366d0bd98890897cef9f455199916baf9480a0d51d16ee738892098bbac5" exitCode=0
Mar 20 11:54:04 crc kubenswrapper[4860]: I0320 11:54:04.962089 4860 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fdtds" event={"ID":"704d95d6-aca3-4174-b2ac-985b2bfbeb5d","Type":"ContainerDied","Data":"330d366d0bd98890897cef9f455199916baf9480a0d51d16ee738892098bbac5"}
Mar 20 11:54:05 crc kubenswrapper[4860]: I0320 11:54:05.006960 4860 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6pxcc" podStartSLOduration=2.214376319 podStartE2EDuration="5.006251899s" podCreationTimestamp="2026-03-20 11:54:00 +0000 UTC" firstStartedPulling="2026-03-20 11:54:01.887191797 +0000 UTC m=+3566.108552695" lastFinishedPulling="2026-03-20 11:54:04.679067377 +0000 UTC m=+3568.900428275" observedRunningTime="2026-03-20 11:54:05.002048845 +0000 UTC m=+3569.223409743" watchObservedRunningTime="2026-03-20 11:54:05.006251899 +0000 UTC m=+3569.227612807"
Mar 20 11:54:05 crc kubenswrapper[4860]: I0320 11:54:05.438693 4860 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566788-dlb4m"]
Mar 20 11:54:05 crc kubenswrapper[4860]: I0320 11:54:05.440730 4860 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566788-dlb4m"]